# Results by Prompt Templates
The prompt template you use can sometimes matter more than the model itself. Here we compare the performance of the various prompt templates available in the PromptingTools.jl package.
Reminder: the scores below are on a 0-100 scale, where 100 is the best possible score and 0 means the generated code was not even parseable.
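For reference, each prompt label compared below is a template shipped with PromptingTools.jl. You can inspect the bundled templates locally, with no API call, via `aitemplates` (a quick sketch; the search string is just a partial template name):

```julia
using PromptingTools

# Search the bundled templates by partial name; returns metadata
# (name, description, placeholder variables) for each match.
tpls = aitemplates("JuliaExpertAsk")
```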
```julia
# Imports
using JuliaLLMLeaderboard
using CairoMakie, AlgebraOfGraphics
using MarkdownTables, DataFramesMeta
using Statistics: mean, median, quantile;
unscrub_string(s::AbstractString) = split(s, "_") .|> titlecase |> x -> join(x, " ");

# ! Configuration
SAVE_PLOTS = false
DIR_RESULTS = joinpath(pkgdir(JuliaLLMLeaderboard), "code_generation")
PAID_MODELS_DEFAULT = [
    "gpt-3.5-turbo",
    "gpt-3.5-turbo-1106",
    "gpt-3.5-turbo-0125",
    "gpt-4-1106-preview",
    "gpt-4-0125-preview",
    "mistral-tiny",
    "mistral-small",
    "mistral-medium",
    "gemini-1.0-pro-latest",
];
```
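As a quick illustration, the `unscrub_string` helper defined above turns snake_case labels into title-case display names:

```julia
# Convert a snake_case label into a title-case display name,
# mirroring the helper defined above.
unscrub_string(s::AbstractString) = split(s, "_") .|> titlecase |> x -> join(x, " ")

unscrub_string("julia_expert_ask")  # "Julia Expert Ask"
```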
```julia
PROMPTS = [
    "JuliaExpertCoTTask",
    "JuliaExpertAsk",
    "InJulia",
    "JuliaRecapTask",
    "JuliaRecapCoTTask",
];
```

## Load Results
We use only the 5 most recent evaluations available for each definition/model/prompt combination:
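Conceptually (a simplified sketch, not the package's actual implementation), keeping the `max_history = 5` most recent evaluations per combination amounts to a group-by with a per-group sort and tail:

```julia
using DataFrames

# Toy data: seven evaluations of a single definition/model/prompt combination,
# identified by sortable timestamp strings.
df = DataFrame(name = fill("add_yearmonth", 7),
               model = fill("gpt-3.5-turbo", 7),
               prompt_label = fill("InJulia", 7),
               timestamp = string.(20240101:20240107))

# Within each group, keep only the 5 most recent timestamps.
recent = combine(groupby(df, [:name, :model, :prompt_label]),
                 :timestamp => (ts -> last(sort(ts), 5)) => :timestamp)
```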
```julia
df = @chain begin
    load_evals(DIR_RESULTS; max_history = 5)
    # Remove the qwen models, as their evaluations are not correct
    @rsubset !occursin("qwen", :model)
end
```

| Row | device | name | model | prompt_label | prompt_strategy | parsed | executed | unit_tests_count | timestamp | unit_tests_passed | cost | elapsed_seconds | examples_executed | tokens | version_pt | examples_count | version_prompt | parameters | schema | filename | score | temperature | options | top_p | experiment |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | String | String | String | String | String | Bool | Bool | Int64 | String | Int64 | Float64 | Float64 | Int64 | Array… | String | Int64 | String | Object… | String | String | Float64 | Float64? | Object…? | Float64? | String |
| 1 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | AsIs | 1SHOT | false | false | 4 | 20231213_230820__566 | 0 | 0.0 | 13.5575 | 0 | [66, 402] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231213_230820__566.json | 0.0 | missing | missing | missing | |
| 2 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | AsIs | 1SHOT | false | false | 4 | 20231224_215740__786 | 0 | 0.0 | 13.8731 | 0 | [88, 249] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231224_215740__786.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | AsIs | 1SHOT | false | false | 4 | 20231224_215751__149 | 0 | 0.0 | 10.9587 | 0 | [88, 194] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231224_215751__149.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | InJulia | 1SHOT | true | true | 4 | 20231213_230806__193 | 0 | 0.0 | 8.65479 | 0 | [82, 255] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231213_230806__193.json | 50.0 | missing | missing | missing | |
| 5 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | InJulia | 1SHOT | true | true | 4 | 20231224_215713__453 | 0 | 0.0 | 16.5864 | 0 | [90, 299] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231224_215713__453.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | InJulia | 1SHOT | true | true | 4 | 20231224_215726__570 | 0 | 0.0 | 12.339 | 0 | [90, 220] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231224_215726__570.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | InJulia | 1SHOT | true | true | 4 | 20231226_205306__662 | 0 | 0.0 | 16.6737 | 0 | [90, 302] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_205306__662.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_230757__380 | 0 | 0.0 | 7.88754 | 0 | [112, 222] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231213_230757__380.json | 50.0 | missing | missing | missing | |
| 9 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_215643__493 | 1 | 0.0 | 9.52042 | 3 | [129, 157] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231224_215643__493.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_215657__736 | 0 | 0.0 | 12.8782 | 0 | [129, 220] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231224_215657__736.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_205249__838 | 1 | 0.0 | 7.94984 | 1 | [129, 128] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_205249__838.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_230749__984 | 0 | 0.0 | 22.7754 | 0 | [239, 589] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230749__984.json | 50.0 | missing | missing | missing | |
| 13 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_215629__136 | 0 | 0.0 | 15.2446 | 0 | [257, 52] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231224_215629__136.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 14 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_215634__384 | 0 | 0.0 | 4.59877 | 0 | [257, 43] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231224_215634__384.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 15 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_205241__762 | 0 | 0.0 | 21.5078 | 0 | [257, 181] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205241__762.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 16 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_230900__251 | 0 | 0.0 | 17.932 | 0 | [11, 487] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230900__251.json | 50.0 | missing | missing | missing | |
| 17 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_215850__983 | 1 | 0.0 | 20.522 | 1 | [394, 311] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231224_215850__983.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 18 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_215919__856 | 0 | 0.0 | 28.4554 | 0 | [394, 449] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231224_215919__856.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 19 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_205426__535 | 0 | 0.0 | 23.2939 | 0 | [394, 361] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205426__535.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 20 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 4 | 20231213_230842__931 | 0 | 0.0 | 22.6544 | 0 | [383, 525] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231213_230842__931.json | 0.0 | missing | missing | missing | |
| 21 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_215809__525 | 0 | 0.0 | 18.1071 | 0 | [391, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231224_215809__525.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 22 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_215830__702 | 0 | 0.0 | 20.8176 | 0 | [391, 316] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231224_215830__702.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 23 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_205402__392 | 4 | 0.0 | 55.6299 | 4 | [391, 904] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_205402__392.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 24 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200037__720 | 0 | 0.0 | 3.62831 | 0 | [0, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200037__720.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 25 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200041__856 | 0 | 0.0 | 4.12848 | 0 | [0, 317] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200041__856.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 26 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20240131_200044__545 | 0 | 0.0 | 3.0393 | 0 | [0, 234] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200044__545.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 27 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200046__282 | 1 | 0.0 | 1.70754 | 3 | [0, 132] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200046__282.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 28 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200048__342 | 0 | 0.0 | 2.09685 | 0 | [0, 162] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200048__342.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 29 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200009__213 | 0 | 0.0 | 2.41936 | 0 | [0, 186] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200009__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 30 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20240131_200010__835 | 0 | 0.0 | 0.675795 | 0 | [0, 52] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200010__835.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 31 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20240131_200011__352 | 0 | 0.0 | 0.675473 | 0 | [0, 52] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200011__352.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 32 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200012__701 | 0 | 0.0 | 0.791657 | 0 | [0, 61] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200012__701.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 33 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200016__900 | 1 | 0.0 | 2.08184 | 1 | [0, 160] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200016__900.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 34 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240131_195954__675 | 0 | 0.0 | 0.59123 | 0 | [0, 45] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_195954__675.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 35 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_195956__962 | 0 | 0.0 | 2.03611 | 0 | [0, 155] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_195956__962.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 36 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_195958__542 | 1 | 0.0 | 2.4445 | 4 | [0, 186] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_195958__542.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 37 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_200000__154 | 1 | 0.0 | 1.98211 | 1 | [0, 151] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200000__154.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 38 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240131_200001__298 | 0 | 0.0 | 0.775267 | 0 | [0, 59] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200001__298.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 39 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200158__318 | 4 | 0.0 | 2.41592 | 4 | [0, 181] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200158__318.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 40 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240131_200200__669 | 0 | 0.0 | 1.8915 | 0 | [0, 142] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200200__669.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 41 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200203__634 | 4 | 0.0 | 3.51844 | 4 | [0, 263] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200203__634.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 42 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200206__692 | 0 | 0.0 | 2.10539 | 0 | [0, 158] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200206__692.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 43 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200209__305 | 1 | 0.0 | 3.35662 | 4 | [0, 251] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200209__305.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 44 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200115__858 | 1 | 0.0 | 8.21797 | 4 | [0, 605] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200115__858.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 45 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200121__443 | 0 | 0.0 | 4.88334 | 0 | [0, 363] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200121__443.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 46 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200124__160 | 1 | 0.0 | 2.77828 | 1 | [0, 208] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200124__160.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 47 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200126__853 | 4 | 0.0 | 2.53494 | 4 | [0, 190] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200126__853.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 48 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200130__774 | 4 | 0.0 | 3.5596 | 4 | [0, 266] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200130__774.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 49 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | AsIs | 1SHOT | false | false | 4 | 20231213_231000__691 | 0 | 0.0 | 13.974 | 0 | [66, 414] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__AsIs__1SHOT__20231213_231000__691.json | 0.0 | missing | missing | missing | |
| 50 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | AsIs | 1SHOT | false | false | 4 | 20231224_220100__228 | 0 | 0.0 | 40.705 | 0 | [61, 739] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__AsIs__1SHOT__20231224_220100__228.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 51 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | AsIs | 1SHOT | false | false | 4 | 20231224_220114__884 | 0 | 0.0 | 14.2007 | 0 | [61, 261] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__AsIs__1SHOT__20231224_220114__884.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 52 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | InJulia | 1SHOT | true | true | 4 | 20231213_230946__496 | 0 | 0.0 | 17.0295 | 0 | [82, 499] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__InJulia__1SHOT__20231213_230946__496.json | 50.0 | missing | missing | missing | |
| 53 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | InJulia | 1SHOT | false | false | 4 | 20231224_220016__286 | 0 | 0.0 | 12.569 | 0 | [64, 230] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__InJulia__1SHOT__20231224_220016__286.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 54 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | InJulia | 1SHOT | false | false | 4 | 20231224_220019__990 | 0 | 0.0 | 3.50014 | 0 | [64, 57] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__InJulia__1SHOT__20231224_220019__990.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 55 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231213_230929__136 | 0 | 0.0 | 7.67115 | 0 | [112, 215] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231213_230929__136.json | 25.0 | missing | missing | missing | |
| 56 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_215950__869 | 0 | 0.0 | 9.74012 | 0 | [66, 172] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231224_215950__869.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 57 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_220003__975 | 0 | 0.0 | 13.2863 | 0 | [66, 239] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231224_220003__975.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 58 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_230921__834 | 0 | 0.0 | 20.6148 | 0 | [239, 530] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230921__834.json | 50.0 | missing | missing | missing | |
| 59 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_215937__260 | 0 | 0.0 | 18.0286 | 0 | [132, 126] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231224_215937__260.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 60 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_215940__811 | 0 | 0.0 | 3.16863 | 0 | [132, 36] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231224_215940__811.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 61 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_231040__282 | 0 | 0.0 | 17.9712 | 0 | [11, 488] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231040__282.json | 50.0 | missing | missing | missing | |
| 62 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220130__602 | 0 | 0.0 | 1.38221 | 0 | [83, 11] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220130__602.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 63 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220132__819 | 0 | 0.0 | 2.63389 | 0 | [83, 35] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220132__819.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 64 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 4 | 20231213_231022__289 | 0 | 0.0 | 22.288 | 0 | [383, 515] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231213_231022__289.json | 0.0 | missing | missing | missing | |
| 65 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220118__216 | 0 | 0.0 | 3.19781 | 0 | [80, 46] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231224_220118__216.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 66 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220128__849 | 0 | 0.0 | 10.8613 | 0 | [80, 193] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231224_220128__849.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 67 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200632__290 | 0 | 0.0 | 11.7137 | 0 | [0, 426] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200632__290.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 68 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200644__930 | 1 | 0.0 | 11.5195 | 4 | [0, 419] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200644__930.json | 81.25 | missing | {"num_gpu": 99} | missing |
| 69 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200652__369 | 0 | 0.0 | 8.30975 | 0 | [0, 303] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200652__369.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 70 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200657__379 | 0 | 0.0 | 4.57773 | 0 | [0, 167] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200657__379.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 71 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20240131_200704__491 | 0 | 0.0 | 7.48948 | 0 | [0, 273] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200704__491.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 72 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200541__773 | 1 | 0.0 | 2.03048 | 3 | [0, 74] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200541__773.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 73 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 4 | 20240131_200548__544 | 0 | 0.0 | 7.45249 | 0 | [0, 271] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200548__544.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 74 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200553__585 | 1 | 0.0 | 5.40922 | 1 | [0, 197] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200553__585.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 75 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200555__776 | 1 | 0.0 | 1.70079 | 4 | [0, 62] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200555__776.json | 81.25 | missing | {"num_gpu": 99} | missing |
| 76 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20240131_200600__517 | 0 | 0.0 | 4.24535 | 0 | [0, 155] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200600__517.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 77 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240131_200441__932 | 0 | 0.0 | 4.7203 | 0 | [0, 171] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200441__932.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 78 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240131_200443__669 | 0 | 0.0 | 1.49066 | 0 | [0, 54] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200443__669.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 79 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_200455__137 | 0 | 0.0 | 12.7932 | 0 | [0, 460] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200455__137.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 80 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240131_200504__980 | 0 | 0.0 | 8.3367 | 0 | [0, 301] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200504__980.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 81 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_200511__788 | 0 | 0.0 | 7.41211 | 0 | [0, 268] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200511__788.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 82 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240131_200915__927 | 0 | 0.0 | 4.89063 | 0 | [0, 176] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200915__927.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 83 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200930__581 | 1 | 0.0 | 15.4431 | 3 | [0, 552] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200930__581.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 84 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200949__649 | 1 | 0.0 | 18.4857 | 1 | [0, 660] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200949__649.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 85 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240131_200951__204 | 0 | 0.0 | 1.69633 | 0 | [0, 61] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200951__204.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 86 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_201009__237 | 0 | 0.0 | 18.3126 | 0 | [0, 654] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_201009__237.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 87 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200748__450 | 1 | 0.0 | 7.13106 | 4 | [0, 256] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200748__450.json | 81.25 | missing | {"num_gpu": 99} | missing |
| 88 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200803__133 | 1 | 0.0 | 15.2005 | 3 | [0, 543] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200803__133.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 89 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200815__798 | 4 | 0.0 | 11.4399 | 2 | [0, 409] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200815__798.json | 87.5 | missing | {"num_gpu": 99} | missing |
| 90 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200826__199 | 0 | 0.0 | 10.8975 | 0 | [0, 390] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200826__199.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 91 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200834__305 | 1 | 0.0 | 8.76437 | 1 | [0, 314] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200834__305.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 92 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 4 | 20240131_195546__567 | 1 | 0.0 | 4.65735 | 3 | [96, 107] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_195546__567.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 93 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 4 | 20240131_195554__411 | 0 | 0.0 | 7.92804 | 0 | [0, 197] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_195554__411.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 94 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 4 | 20240131_195615__571 | 0 | 0.0 | 20.7528 | 0 | [0, 512] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_195615__571.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 95 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 4 | 20240131_195624__501 | 0 | 0.0 | 8.82863 | 0 | [0, 219] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_195624__501.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 96 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 4 | 20240131_195628__459 | 1 | 0.0 | 4.25507 | 3 | [0, 106] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_195628__459.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 97 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20240131_194948__717 | 0 | 0.0 | 7.00596 | 0 | [0, 174] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_194948__717.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 98 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_194955__311 | 0 | 0.0 | 6.87766 | 0 | [0, 171] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_194955__311.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 99 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20240131_195003__574 | 0 | 0.0 | 7.64486 | 0 | [0, 190] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_195003__574.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 100 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_195009__918 | 1 | 0.0 | 5.94625 | 1 | [0, 148] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_195009__918.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 101 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20240131_195016__470 | 0 | 0.0 | 7.45146 | 0 | [0, 185] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_195016__470.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 102 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240131_194825__379 | 0 | 0.0 | 8.17644 | 0 | [0, 202] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_194825__379.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 103 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_194837__788 | 4 | 0.0 | 12.5914 | 4 | [0, 310] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_194837__788.json | 100.0 | missing | {"num_gpu": 99} | missing |
| 104 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240131_194841__251 | 0 | 0.0 | 3.39725 | 0 | [0, 84] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_194841__251.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 105 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240131_194852__713 | 0 | 0.0 | 11.1555 | 0 | [0, 275] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_194852__713.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 106 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_194856__588 | 1 | 0.0 | 4.20616 | 3 | [0, 104] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_194856__588.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 107 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240131_195855__356 | 0 | 0.0 | 10.0093 | 0 | [0, 245] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_195855__356.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 108 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_195907__269 | 1 | 0.0 | 12.1109 | 3 | [0, 296] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_195907__269.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 109 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_195916__494 | 1 | 0.0 | 9.35382 | 4 | [0, 229] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_195916__494.json | 81.25 | missing | {"num_gpu": 99} | missing |
| 110 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240131_195934__349 | 0 | 0.0 | 17.6827 | 0 | [0, 431] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_195934__349.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 111 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240131_195938__111 | 0 | 0.0 | 3.98413 | 0 | [0, 98] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_195938__111.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 112 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_195729__355 | 0 | 0.0 | 4.29962 | 0 | [0, 106] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_195729__355.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 113 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_195735__611 | 0 | 0.0 | 5.56805 | 0 | [0, 137] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_195735__611.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 114 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20240131_195740__258 | 0 | 0.0 | 4.0996 | 0 | [0, 101] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_195740__258.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 115 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_195754__940 | 1 | 0.0 | 14.255 | 1 | [0, 348] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_195754__940.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 116 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20240131_195811__616 | 0 | 0.0 | 16.5586 | 0 | [0, 404] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_195811__616.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 117 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_194052__514 | 1 | 0.0 | 7.51569 | 3 | [0, 142] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_194052__514.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 118 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20240131_194059__305 | 0 | 0.0 | 7.14506 | 0 | [0, 135] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_194059__305.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 119 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20240131_194115__248 | 0 | 0.0 | 15.1208 | 0 | [0, 285] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_194115__248.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 120 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20240131_194139__839 | 0 | 0.0 | 24.8192 | 0 | [0, 466] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_194139__839.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 121 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20240131_194207__286 | 0 | 0.0 | 27.6215 | 0 | [0, 518] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_194207__286.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 122 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20240131_193849__142 | 0 | 0.0 | 25.8302 | 0 | [0, 484] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_193849__142.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 123 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_193904__125 | 1 | 0.0 | 14.4532 | 3 | [0, 272] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_193904__125.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 124 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20240131_193916__685 | 0 | 0.0 | 12.2775 | 0 | [0, 231] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_193916__685.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 125 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_193927__127 | 0 | 0.0 | 10.7201 | 0 | [0, 202] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_193927__127.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 126 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20240131_193939__825 | 0 | 0.0 | 11.4102 | 0 | [0, 215] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_193939__825.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 127 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240131_193601__955 | 0 | 0.0 | 19.0559 | 0 | [0, 356] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_193601__955.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 128 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240131_193614__622 | 0 | 0.0 | 12.1635 | 0 | [0, 228] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_193614__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 129 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_193641__253 | 1 | 0.0 | 27.8219 | 4 | [0, 518] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_193641__253.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 130 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_193655__997 | 0 | 0.0 | 13.0668 | 0 | [0, 245] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_193655__997.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 131 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_193709__386 | 0 | 0.0 | 13.2898 | 0 | [0, 249] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_193709__386.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 132 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_194605__487 | 1 | 0.0 | 15.5439 | 4 | [0, 289] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_194605__487.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 133 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20240131_194616__610 | 0 | 0.0 | 11.7083 | 0 | [0, 218] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_194616__610.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 134 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240131_194641__310 | 0 | 0.0 | 24.1412 | 0 | [0, 448] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_194641__310.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 135 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_194652__478 | 4 | 0.0 | 11.2163 | 4 | [0, 209] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_194652__478.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 136 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240131_194701__885 | 0 | 0.0 | 9.1083 | 0 | [0, 170] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_194701__885.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 137 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20240131_194357__672 | 0 | 0.0 | 14.3923 | 0 | [0, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_194357__672.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 138 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20240131_194401__225 | 0 | 0.0 | 3.79091 | 0 | [0, 71] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_194401__225.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 139 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20240131_194407__457 | 0 | 0.0 | 6.19336 | 0 | [0, 116] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_194407__457.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 140 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_194432__512 | 4 | 0.0 | 25.0987 | 4 | [0, 466] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_194432__512.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 141 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20240131_194441__939 | 0 | 0.0 | 9.15485 | 0 | [0, 171] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_194441__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 142 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200250__109 | 0 | 0.0 | 1.45782 | 0 | [0, 179] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200250__109.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 143 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200252__719 | 1 | 0.0 | 1.82176 | 4 | [0, 223] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200252__719.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 144 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200253__925 | 0 | 0.0 | 1.28569 | 0 | [0, 158] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200253__925.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 145 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200255__358 | 0 | 0.0 | 1.9812 | 0 | [0, 243] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200255__358.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 146 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20240131_200257__295 | 0 | 0.0 | 1.45955 | 0 | [0, 179] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_200257__295.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 147 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200238__672 | 1 | 0.0 | 0.360509 | 2 | [0, 44] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200238__672.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 148 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200239__470 | 1 | 0.0 | 0.490377 | 4 | [0, 60] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200239__470.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 149 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200240__285 | 0 | 0.0 | 0.394359 | 0 | [0, 48] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200240__285.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 150 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200240__304 | 1 | 0.0 | 0.492035 | 4 | [0, 60] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200240__304.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 151 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240131_200241__289 | 0 | 0.0 | 0.392737 | 0 | [0, 48] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_200241__289.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 152 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240131_200228__186 | 0 | 0.0 | 2.85795 | 0 | [0, 345] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200228__186.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 153 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240131_200230__241 | 0 | 0.0 | 2.09115 | 0 | [0, 254] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200230__241.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 154 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240131_200232__575 | 1 | 0.0 | 2.26998 | 1 | [0, 275] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200232__575.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 155 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240131_200235__891 | 0 | 0.0 | 2.46083 | 0 | [0, 298] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200235__891.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 156 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240131_200236__407 | 0 | 0.0 | 1.10007 | 0 | [0, 134] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_200236__407.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 157 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200344__119 | 0 | 0.0 | 1.4978 | 0 | [0, 180] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200344__119.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 158 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200349__594 | 0 | 0.0 | 4.32691 | 0 | [0, 513] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200349__594.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 159 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200350__746 | 0 | 0.0 | 1.22229 | 0 | [0, 147] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200350__746.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 160 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240131_200353__184 | 1 | 0.0 | 2.22557 | 1 | [0, 266] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200353__184.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 161 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20240131_200355__878 | 0 | 0.0 | 2.6145 | 0 | [0, 312] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_200355__878.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 162 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200307__400 | 1 | 0.0 | 0.526284 | 1 | [0, 63] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200307__400.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 163 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200309__243 | 0 | 0.0 | 2.56539 | 0 | [0, 305] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200309__243.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 164 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200314__380 | 0 | 0.0 | 4.12777 | 0 | [0, 487] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200314__380.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 165 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200317__369 | 1 | 0.0 | 3.02674 | 3 | [0, 359] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200317__369.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 166 | NVIDIA-RTX-4090-4x | add_yearmonth | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20240131_200324__433 | 0 | 0.0 | 7.11797 | 0 | [0, 826] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_200324__433.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 167 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231219_205950__921 | 0 | 0.0 | 11.8906 | 0 | [66, 351] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_205950__921.json | 0.0 | missing | missing | missing | |
| 168 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231219_210004__584 | 0 | 0.0 | 13.85 | 0 | [1, 425] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_210004__584.json | 0.0 | missing | missing | missing | |
| 169 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231219_210023__320 | 0 | 0.0 | 18.3655 | 0 | [1, 553] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_210023__320.json | 0.0 | missing | missing | missing | |
| 170 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231224_222446__611 | 0 | 0.0 | 63.7589 | 0 | [80, 375] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231224_222446__611.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 171 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231224_222516__645 | 0 | 0.0 | 30.1172 | 0 | [80, 173] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231224_222516__645.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 172 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20231219_205930__896 | 0 | 0.0 | 12.9329 | 0 | [1, 399] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_205930__896.json | 0.0 | missing | missing | missing | |
| 173 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231219_205938__989 | 0 | 0.0 | 8.49001 | 0 | [1, 260] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_205938__989.json | 25.0 | missing | missing | missing | |
| 174 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231224_222301__332 | 0 | 0.0 | 50.0536 | 0 | [83, 287] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_222301__332.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 175 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231224_222342__552 | 0 | 0.0 | 39.2743 | 0 | [83, 229] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_222342__552.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 176 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231226_210403__146 | 0 | 0.0 | 32.3636 | 0 | [83, 191] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_210403__146.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 177 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231219_205855__890 | 0 | 0.0 | 7.41282 | 0 | [1, 233] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_205855__890.json | 50.0 | missing | missing | missing | |
| 178 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231219_205901__340 | 0 | 0.0 | 5.4559 | 0 | [1, 172] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_205901__340.json | 0.0 | missing | missing | missing | |
| 179 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_222043__593 | 0 | 0.0 | 37.0856 | 0 | [124, 193] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_222043__593.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 180 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_222211__903 | 0 | 0.0 | 88.1791 | 0 | [124, 482] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_222211__903.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 181 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_210331__537 | 0 | 0.0 | 46.3106 | 0 | [124, 272] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_210331__537.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 182 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231219_205826__211 | 0 | 0.0 | 14.1975 | 0 | [1, 401] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_205826__211.json | 50.0 | missing | missing | missing | |
| 183 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_205837__773 | 0 | 0.0 | 11.4811 | 0 | [1, 340] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_205837__773.json | 25.0 | missing | missing | missing | |
| 184 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_221918__361 | 4 | 0.0 | 80.9647 | 4 | [252, 222] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_221918__361.json | 100.0 | missing | {"num_gpu": 99} | missing |
| 185 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_222005__409 | 4 | 0.0 | 46.583 | 4 | [252, 217] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_222005__409.json | 100.0 | missing | {"num_gpu": 99} | missing |
| 186 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_210244__148 | 4 | 0.0 | 77.3058 | 4 | [252, 283] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_210244__148.json | 100.0 | missing | {"num_gpu": 99} | missing |
| 187 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231219_210157__128 | 0 | 0.0 | 20.067 | 0 | [1, 549] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_210157__128.json | 50.0 | missing | missing | missing | |
| 188 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231219_210215__790 | 0 | 0.0 | 18.1918 | 0 | [1, 501] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_210215__790.json | 0.0 | missing | missing | missing | |
| 189 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_222805__507 | 4 | 0.0 | 80.8156 | 4 | [412, 425] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_222805__507.json | 100.0 | missing | {"num_gpu": 99} | missing |
| 190 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_222824__706 | 1 | 0.0 | 19.5209 | 3 | [412, 58] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_222824__706.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 191 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_210644__755 | 4 | 0.0 | 56.8096 | 4 | [412, 285] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_210644__755.json | 100.0 | missing | {"num_gpu": 99} | missing |
| 192 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_210059__809 | 0 | 0.0 | 17.7074 | 0 | [1, 489] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_210059__809.json | 25.0 | missing | missing | missing | |
| 193 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231219_210122__439 | 0 | 0.0 | 22.2271 | 0 | [1, 603] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_210122__439.json | 50.0 | missing | missing | missing | |
| 194 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_222548__552 | 0 | 0.0 | 32.3384 | 0 | [410, 135] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_222548__552.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 195 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_222643__363 | 0 | 0.0 | 54.1194 | 0 | [410, 268] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_222643__363.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 196 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_210547__589 | 0 | 0.0 | 103.052 | 0 | [410, 559] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_210547__589.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 197 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 4 | 20231226_211452__883 | 0 | 0.0 | 5.42072 | 0 | [85, 207] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_211452__883.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 198 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 4 | 20231227_095541__986 | 0 | 0.0 | 13.561 | 0 | [85, 516] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_095541__986.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 199 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 4 | 20231227_095551__780 | 0 | 0.0 | 9.78193 | 0 | [85, 375] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_095551__780.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 200 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 4 | 20231227_095559__848 | 0 | 0.0 | 8.36018 | 0 | [85, 321] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_095559__848.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 201 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_211446__834 | 0 | 0.0 | 1.89911 | 0 | [122, 63] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_211446__834.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 202 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_095522__559 | 0 | 0.0 | 7.51261 | 0 | [122, 283] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_095522__559.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 203 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_095526__194 | 0 | 0.0 | 4.2142 | 0 | [122, 155] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_095526__194.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 204 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_095527__797 | 0 | 0.0 | 1.5631 | 0 | [122, 49] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_095527__797.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 205 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_211444__679 | 0 | 0.0 | 10.4501 | 0 | [234, 246] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211444__679.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 206 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231227_095458__906 | 0 | 0.0 | 9.44909 | 0 | [234, 212] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095458__906.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 207 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_095508__391 | 0 | 0.0 | 10.0284 | 0 | [234, 356] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095508__391.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 208 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231227_095514__173 | 0 | 0.0 | 6.26306 | 0 | [234, 215] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095514__173.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 209 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_211518__804 | 0 | 0.0 | 13.9077 | 0 | [374, 465] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211518__804.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 210 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_095631__152 | 0 | 0.0 | 10.6941 | 0 | [374, 352] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_095631__152.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 211 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_095636__478 | 0 | 0.0 | 4.92165 | 0 | [374, 141] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_095636__478.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 212 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_095646__355 | 0 | 0.0 | 10.7356 | 0 | [374, 353] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_095646__355.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 213 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_211504__648 | 0 | 0.0 | 12.2395 | 0 | [371, 407] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_211504__648.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 214 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231227_095606__405 | 0 | 0.0 | 6.97045 | 0 | [371, 217] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_095606__405.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 215 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_095613__194 | 0 | 0.0 | 6.77164 | 0 | [371, 210] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_095613__194.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 216 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 4 | 20231227_095620__390 | 0 | 0.0 | 6.74354 | 0 | [371, 209] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_095620__390.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 217 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 4 | 20240217_104048__300 | 0 | 0.0 | 2.29789 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104048__300.json | 0.0 | missing | missing | missing | |
| 218 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 4 | 20240217_104059__559 | 0 | 0.0 | 10.0714 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104059__559.json | 50.0 | missing | missing | missing | |
| 219 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 4 | 20240217_104101__129 | 0 | 0.0 | 1.96821 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104101__129.json | 50.0 | missing | missing | missing | |
| 220 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 4 | 20240217_104105__241 | 0 | 0.0 | 3.90492 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104105__241.json | 50.0 | missing | missing | missing | |
| 221 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 4 | 20240217_113811__303 | 0 | 0.0 | 4.52476 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_113811__303.json | 50.0 | missing | missing | missing | |
| 222 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240217_104017__430 | 0 | 0.0 | 1.98363 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104017__430.json | 50.0 | missing | missing | missing | |
| 223 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240217_104027__690 | 0 | 0.0 | 9.10182 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104027__690.json | 50.0 | missing | missing | missing | |
| 224 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240217_104028__967 | 0 | 0.0 | 1.8463 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104028__967.json | 50.0 | missing | missing | missing | |
| 225 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240217_104031__806 | 0 | 0.0 | 2.3848 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104031__806.json | 50.0 | missing | missing | missing | |
| 226 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240217_104033__433 | 0 | 0.0 | 2.00072 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104033__433.json | 50.0 | missing | missing | missing | |
| 227 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240217_103926__562 | 0 | 0.0 | 5.54728 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_103926__562.json | 25.0 | missing | missing | missing | |
| 228 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240217_103934__263 | 0 | 0.0 | 7.93824 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_103934__263.json | 25.0 | missing | missing | missing | |
| 229 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240217_103937__360 | 0 | 0.0 | 2.94072 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_103937__360.json | 25.0 | missing | missing | missing | |
| 230 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240217_103941__498 | 0 | 0.0 | 3.86035 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_103941__498.json | 0.0 | missing | missing | missing | |
| 231 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240217_103950__319 | 0 | 0.0 | 9.1595 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_103950__319.json | 50.0 | missing | missing | missing | |
| 232 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240217_104201__597 | 0 | 0.0 | 3.57433 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104201__597.json | 50.0 | missing | missing | missing | |
| 233 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240217_104215__626 | 0 | 0.0 | 14.2833 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104215__626.json | 50.0 | missing | missing | missing | |
| 234 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240217_104222__152 | 0 | 0.0 | 6.81103 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104222__152.json | 50.0 | missing | missing | missing | |
| 235 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20240217_104233__256 | 0 | 0.0 | 11.301 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104233__256.json | 25.0 | missing | missing | missing | |
| 236 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20240217_104245__528 | 0 | 0.0 | 11.2645 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104245__528.json | 25.0 | missing | missing | missing | |
| 237 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20240217_104122__401 | 0 | 0.0 | 1.95883 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104122__401.json | 50.0 | missing | missing | missing | |
| 238 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 4 | 20240217_104130__306 | 0 | 0.0 | 7.23995 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104130__306.json | 0.0 | missing | missing | missing | |
| 239 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20240217_104135__496 | 0 | 0.0 | 4.86413 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104135__496.json | 50.0 | missing | missing | missing | |
| 240 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20240217_104137__555 | 0 | 0.0 | 2.68328 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104137__555.json | 50.0 | missing | missing | missing | |
| 241 | Apple-MacBook-Pro-M1 | add_yearmonth | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20240217_104141__884 | 0 | 0.0 | 3.26962 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104141__884.json | 50.0 | missing | missing | missing | |
| 242 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 4 | 20240223_211957__783 | 0 | 0.0 | 26.2175 | 0 | [0, 404] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_211957__783.json | 25.0 | missing | missing | missing | |
| 243 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 4 | 20240223_212021__361 | 0 | 0.0 | 24.1517 | 0 | [0, 375] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_212021__361.json | 0.0 | missing | missing | missing | |
| 244 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 4 | 20240223_212047__585 | 0 | 0.0 | 26.0207 | 0 | [0, 402] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_212047__585.json | 25.0 | missing | missing | missing | |
| 245 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 4 | 20240223_212111__721 | 0 | 0.0 | 23.2359 | 0 | [0, 362] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_212111__721.json | 0.0 | missing | missing | missing | |
| 246 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 4 | 20240223_212133__594 | 0 | 0.0 | 22.5582 | 0 | [0, 349] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_212133__594.json | 50.0 | missing | missing | missing | |
| 247 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240223_211713__320 | 0 | 0.0 | 1.52268 | 0 | [0, 24] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_211713__320.json | 50.0 | missing | missing | missing | |
| 248 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240223_211715__518 | 0 | 0.0 | 1.57829 | 0 | [0, 25] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_211715__518.json | 50.0 | missing | missing | missing | |
| 249 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240223_211716__841 | 0 | 0.0 | 1.58304 | 0 | [0, 25] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_211716__841.json | 50.0 | missing | missing | missing | |
| 250 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240223_211727__793 | 0 | 0.0 | 10.8163 | 0 | [0, 169] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_211727__793.json | 50.0 | missing | missing | missing | |
| 251 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240223_211729__864 | 0 | 0.0 | 1.53533 | 0 | [0, 24] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_211729__864.json | 50.0 | missing | missing | missing | |
| 252 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240223_211507__264 | 0 | 0.0 | 24.3438 | 0 | [0, 374] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_211507__264.json | 0.0 | missing | missing | missing | |
| 253 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240223_211541__441 | 0 | 0.0 | 34.0774 | 0 | [0, 520] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_211541__441.json | 0.0 | missing | missing | missing | |
| 254 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240223_211608__276 | 0 | 0.0 | 27.0333 | 0 | [0, 412] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_211608__276.json | 0.0 | missing | missing | missing | |
| 255 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20240223_211630__658 | 0 | 0.0 | 21.5496 | 0 | [0, 332] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_211630__658.json | 25.0 | missing | missing | missing | |
| 256 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20240223_211700__269 | 0 | 0.0 | 30.6501 | 0 | [0, 472] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_211700__269.json | 0.0 | missing | missing | missing | |
| 257 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240223_212840__872 | 0 | 0.0 | 25.7808 | 0 | [0, 397] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_212840__872.json | 50.0 | missing | missing | missing | |
| 258 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240223_212910__793 | 0 | 0.0 | 30.1527 | 0 | [0, 464] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_212910__793.json | 50.0 | missing | missing | missing | |
| 259 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240223_212932__464 | 0 | 0.0 | 22.3749 | 0 | [0, 346] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_212932__464.json | 0.0 | missing | missing | missing | |
| 260 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240223_213006__871 | 0 | 0.0 | 33.4637 | 0 | [0, 514] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_213006__871.json | 50.0 | missing | missing | missing | |
| 261 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240223_213034__211 | 0 | 0.0 | 27.9294 | 0 | [0, 430] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_213034__211.json | 50.0 | missing | missing | missing | |
| 262 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20240223_212417__179 | 0 | 0.0 | 21.8527 | 0 | [0, 338] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_212417__179.json | 0.0 | missing | missing | missing | |
| 263 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20240223_212443__337 | 0 | 0.0 | 26.8797 | 0 | [0, 414] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_212443__337.json | 50.0 | missing | missing | missing | |
| 264 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | false | 4 | 20240223_212506__524 | 0 | 0.0 | 22.3571 | 0 | [0, 346] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_212506__524.json | 25.0 | missing | missing | missing | |
| 265 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20240223_212525__438 | 0 | 0.0 | 19.3491 | 0 | [0, 298] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_212525__438.json | 0.0 | missing | missing | missing | |
| 266 | Apple-MacBook-Pro-M1 | add_yearmonth | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20240223_212553__476 | 0 | 0.0 | 28.2341 | 0 | [0, 431] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_212553__476.json | 0.0 | missing | missing | missing | |
| 267 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 4 | 20231213_211438__690 | 0 | 0.0004515 | 6.72312 | 0 | [72, 277] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_211438__690.json | 0.0 | missing | missing | missing | |
| 268 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 4 | 20231225_181812__293 | 0 | 0.0003795 | 4.98917 | 0 | [72, 229] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_181812__293.json | 0.0 | missing | missing | missing | |
| 269 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 4 | 20231225_181817__417 | 0 | 0.000357 | 4.57924 | 0 | [72, 214] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_181817__417.json | 0.0 | missing | missing | missing | |
| 270 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo--optim | AsIs | 1SHOT | false | false | 4 | 20231215_184723__866 | 0 | 0.0 | 6.80391 | 0 | [72, 258] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_184723__866.json | 0.0 | 0.5 | missing | 0.5 | |
| 271 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 4 | 20231213_211431__467 | 0 | 0.0004515 | 5.99049 | 0 | [75, 276] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_211431__467.json | 50.0 | missing | missing | missing | |
| 272 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 4 | 20231225_181800__946 | 0 | 0.000339 | 4.10865 | 0 | [75, 201] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_181800__946.json | 50.0 | missing | missing | missing | |
| 273 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 4 | 20231225_181807__102 | 0 | 0.0005715 | 6.4554 | 0 | [75, 356] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_181807__102.json | 50.0 | missing | missing | missing | |
| 274 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 4 | 20231227_185129__323 | 0 | 0.000789 | 8.51393 | 0 | [75, 501] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_185129__323.json | 50.0 | missing | missing | missing | |
| 275 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 4 | 20231227_185135__118 | 0 | 0.0005955 | 6.56973 | 0 | [75, 372] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_185135__118.json | 50.0 | missing | missing | missing | |
| 276 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 4 | 20231215_184716__541 | 0 | 0.0 | 8.21824 | 0 | [75, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_184716__541.json | 50.0 | 0.5 | missing | 0.5 | |
| 277 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_211425__204 | 0 | 0.000277 | 3.7471 | 0 | [110, 148] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_211425__204.json | 50.0 | missing | missing | missing | |
| 278 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231225_181752__802 | 0 | 0.0002815 | 2.58455 | 0 | [110, 151] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_181752__802.json | 50.0 | missing | missing | missing | |
| 279 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231225_181756__324 | 1 | 0.000298 | 3.62927 | 1 | [110, 162] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_181756__324.json | 62.5 | missing | missing | missing | |
| 280 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185117__917 | 0 | 0.0003865 | 4.2748 | 0 | [110, 221] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_185117__917.json | 50.0 | missing | missing | missing | |
| 281 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185120__898 | 0 | 0.000259 | 2.85622 | 0 | [110, 136] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_185120__898.json | 50.0 | missing | missing | missing | |
| 282 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231215_184708__765 | 0 | 0.0 | 7.96635 | 0 | [110, 376] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_184708__765.json | 50.0 | 0.5 | missing | 0.5 | |
| 283 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231213_211421__744 | 0 | 0.0002885 | 3.94675 | 0 | [211, 122] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_211421__744.json | 0.0 | missing | missing | missing | |
| 284 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231225_181742__746 | 0 | 0.00029 | 3.90551 | 0 | [211, 123] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181742__746.json | 0.0 | missing | missing | missing | |
| 285 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231225_181749__692 | 1 | 0.000677 | 6.48467 | 3 | [211, 381] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181749__692.json | 75.0 | missing | missing | missing | |
| 286 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_185110__606 | 0 | 0.000662 | 7.34741 | 0 | [211, 371] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185110__606.json | 25.0 | missing | missing | missing | |
| 287 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231227_185113__752 | 0 | 0.000314 | 2.68751 | 0 | [211, 139] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185113__752.json | 0.0 | missing | missing | missing | |
| 288 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231215_184700__820 | 0 | 0.0 | 3.57948 | 0 | [211, 137] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_184700__820.json | 0.0 | 0.5 | missing | 0.5 | |
| 289 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_211448__527 | 0 | 0.000548 | 6.16651 | 0 | [334, 254] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_211448__527.json | 50.0 | missing | missing | missing | |
| 290 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231225_181830__813 | 0 | 0.000767 | 7.0642 | 0 | [334, 400] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181830__813.json | 50.0 | missing | missing | missing | |
| 291 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231225_181837__722 | 1 | 0.000635 | 6.07225 | 1 | [334, 312] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181837__722.json | 62.5 | missing | missing | missing | |
| 292 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_185143__340 | 0 | 0.0002855 | 1.5188 | 0 | [334, 79] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185143__340.json | 0.0 | missing | missing | missing | |
| 293 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_185146__274 | 0 | 0.000362 | 2.31049 | 0 | [334, 130] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185146__274.json | 0.0 | missing | missing | missing | |
| 294 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231215_184739__721 | 0 | 0.0 | 6.73031 | 0 | [334, 330] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_184739__721.json | 50.0 | 0.5 | missing | 0.5 | |
| 295 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 4 | 20231213_211442__133 | 0 | 0.00039 | 3.7376 | 0 | [333, 149] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_211442__133.json | 0.0 | missing | missing | missing | |
| 296 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_181821__709 | 0 | 0.000498 | 4.26221 | 0 | [333, 221] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_181821__709.json | 50.0 | missing | missing | missing | |
| 297 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 4 | 20231225_181823__624 | 0 | 0.0003075 | 2.20217 | 0 | [333, 94] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_181823__624.json | 0.0 | missing | missing | missing | |
| 298 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 4 | 20231227_185139__950 | 0 | 0.0004065 | 3.0678 | 0 | [333, 160] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_185139__950.json | 0.0 | missing | missing | missing | |
| 299 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 4 | 20231227_185142__394 | 0 | 0.000444 | 3.2828 | 0 | [333, 185] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_185142__394.json | 0.0 | missing | missing | missing | |
| 300 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | true | true | 4 | 20231215_184732__652 | 0 | 0.0 | 8.39342 | 0 | [333, 360] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_184732__652.json | 50.0 | 0.5 | missing | 0.5 | |
| 301 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 4 | 20240201_200118__921 | 0 | 0.0002385 | 1.10606 | 0 | [75, 134] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200118__921.json | 50.0 | missing | missing | missing | |
| 302 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | false | 4 | 20240201_200119__797 | 0 | 0.0002865 | 1.46989 | 0 | [75, 166] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200119__797.json | 25.0 | missing | missing | missing | |
| 303 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 4 | 20240201_200120__226 | 1 | 0.0001995 | 0.882261 | 1 | [75, 108] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200120__226.json | 62.5 | missing | missing | missing | |
| 304 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 4 | 20240201_200122__398 | 4 | 0.000315 | 1.56769 | 4 | [75, 185] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200122__398.json | 100.0 | missing | missing | missing | |
| 305 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 4 | 20240201_200124__645 | 0 | 0.0002325 | 1.24141 | 0 | [75, 130] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200124__645.json | 50.0 | missing | missing | missing | |
| 306 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_200114__950 | 0 | 0.000127 | 0.751227 | 0 | [110, 48] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200114__950.json | 50.0 | missing | missing | missing | |
| 307 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_200115__712 | 0 | 0.000124 | 0.779156 | 0 | [110, 46] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200115__712.json | 50.0 | missing | missing | missing | |
| 308 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_200115__788 | 4 | 0.0001225 | 0.751311 | 4 | [110, 45] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200115__788.json | 100.0 | missing | missing | missing | |
| 309 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_200116__190 | 4 | 0.0001225 | 0.578463 | 4 | [110, 45] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200116__190.json | 100.0 | missing | missing | missing | |
| 310 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_200117__350 | 4 | 0.0001195 | 0.587046 | 4 | [110, 43] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200117__350.json | 100.0 | missing | missing | missing | |
| 311 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_200108__564 | 1 | 0.000257 | 1.09689 | 4 | [211, 101] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200108__564.json | 81.25 | missing | missing | missing | |
| 312 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_200109__328 | 4 | 0.000278 | 1.00124 | 4 | [211, 115] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200109__328.json | 100.0 | missing | missing | missing | |
| 313 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_200110__266 | 4 | 0.000266 | 0.908459 | 4 | [211, 107] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200110__266.json | 100.0 | missing | missing | missing | |
| 314 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_200112__687 | 4 | 0.000377 | 1.56354 | 4 | [211, 181] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200112__687.json | 100.0 | missing | missing | missing | |
| 315 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_200113__141 | 4 | 0.0002465 | 0.853601 | 4 | [211, 94] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200113__141.json | 100.0 | missing | missing | missing | |
| 316 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240201_200129__138 | 0 | 0.000317 | 0.924106 | 0 | [334, 100] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200129__138.json | 0.0 | missing | missing | missing | |
| 317 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240201_200131__214 | 4 | 0.0004355 | 1.62362 | 4 | [334, 179] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200131__214.json | 100.0 | missing | missing | missing | |
| 318 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240201_200131__835 | 0 | 0.0002975 | 0.859286 | 0 | [334, 87] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200131__835.json | 0.0 | missing | missing | missing | |
| 319 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240201_200132__605 | 0 | 0.000296 | 0.963006 | 0 | [334, 86] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200132__605.json | 0.0 | missing | missing | missing | |
| 320 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20240201_200134__731 | 0 | 0.000311 | 1.09957 | 0 | [334, 96] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200134__731.json | 0.0 | missing | missing | missing | |
| 321 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 4 | 20240201_200124__335 | 0 | 0.000237 | 0.75276 | 0 | [333, 47] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200124__335.json | 50.0 | missing | missing | missing | |
| 322 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 4 | 20240201_200125__136 | 0 | 0.0002295 | 0.731522 | 0 | [333, 42] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200125__136.json | 50.0 | missing | missing | missing | |
| 323 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 4 | 20240201_200126__349 | 4 | 0.000231 | 0.72085 | 4 | [333, 43] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200126__349.json | 100.0 | missing | missing | missing | |
| 324 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 4 | 20240201_200127__811 | 1 | 0.0002535 | 0.822307 | 1 | [333, 58] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200127__811.json | 62.5 | missing | missing | missing | |
| 325 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 4 | 20240201_200128__299 | 1 | 0.0002535 | 0.721934 | 1 | [333, 58] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200128__299.json | 62.5 | missing | missing | missing | |
| 326 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 4 | 20231213_211456__606 | 0 | 0.0004 | 2.96632 | 0 | [72, 164] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_211456__606.json | 0.0 | missing | missing | missing | |
| 327 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 4 | 20231225_181850__278 | 0 | 0.000272 | 2.00236 | 0 | [72, 100] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_181850__278.json | 0.0 | missing | missing | missing | |
| 328 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 4 | 20231225_181852__428 | 0 | 0.000268 | 1.92477 | 0 | [72, 98] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_181852__428.json | 0.0 | missing | missing | missing | |
| 329 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 4 | 20231215_184753__946 | 0 | 0.0 | 5.3291 | 0 | [72, 218] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_184753__946.json | 0.0 | 0.9 | missing | 0.1 | |
| 330 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 4 | 20231213_211453__473 | 4 | 0.000335 | 2.1802 | 4 | [75, 130] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_211453__473.json | 100.0 | missing | missing | missing | |
| 331 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 4 | 20231225_181846__589 | 0 | 0.000529 | 3.49558 | 0 | [75, 227] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_181846__589.json | 50.0 | missing | missing | missing | |
| 332 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 4 | 20231225_181848__472 | 0 | 0.000331 | 2.36465 | 0 | [75, 128] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_181848__472.json | 50.0 | missing | missing | missing | |
| 333 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 4 | 20231227_185154__216 | 4 | 0.000431 | 3.33907 | 4 | [75, 178] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_185154__216.json | 100.0 | missing | missing | missing | |
| 334 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 4 | 20231227_185157__867 | 1 | 0.000265 | 2.14967 | 1 | [75, 95] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_185157__867.json | 62.5 | missing | missing | missing | |
| 335 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 4 | 20231215_184748__328 | 0 | 0.0 | 5.40872 | 0 | [75, 162] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_184748__328.json | 50.0 | 0.9 | missing | 0.1 | |
| 336 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_211451__547 | 1 | 0.000196 | 1.22267 | 4 | [110, 43] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_211451__547.json | 81.25 | missing | missing | missing | |
| 337 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231225_181841__333 | 0 | 0.000198 | 0.998436 | 0 | [110, 44] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_181841__333.json | 50.0 | missing | missing | missing | |
| 338 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231225_181842__319 | 4 | 0.000196 | 1.0857 | 4 | [110, 43] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_181842__319.json | 100.0 | missing | missing | missing | |
| 339 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185150__280 | 4 | 0.000196 | 1.43557 | 4 | [110, 43] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_185150__280.json | 100.0 | missing | missing | missing | |
| 340 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185151__163 | 4 | 0.000188 | 0.953849 | 4 | [110, 39] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_185151__163.json | 100.0 | missing | missing | missing | |
| 341 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231215_184742__617 | 4 | 0.0 | 1.79637 | 4 | [110, 43] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_184742__617.json | 100.0 | 0.9 | missing | 0.1 | |
| 342 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_211449__302 | 4 | 0.000297 | 0.963451 | 4 | [211, 43] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_211449__302.json | 100.0 | missing | missing | missing | |
| 343 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231225_181838__341 | 4 | 0.000293 | 1.36626 | 4 | [211, 41] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181838__341.json | 100.0 | missing | missing | missing | |
| 344 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231225_181840__834 | 4 | 0.000297 | 1.25362 | 4 | [211, 43] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181840__834.json | 100.0 | missing | missing | missing | |
| 345 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_185147__509 | 4 | 0.000299 | 1.12141 | 4 | [211, 44] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185147__509.json | 100.0 | missing | missing | missing | |
| 346 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_185148__814 | 4 | 0.000297 | 1.45314 | 4 | [211, 43] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185148__814.json | 100.0 | missing | missing | missing | |
| 347 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231215_184740__660 | 4 | 0.0 | 1.37381 | 4 | [211, 43] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_184740__660.json | 100.0 | 0.9 | missing | 0.1 | |
| 348 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231213_211459__950 | 0 | 0.000546 | 2.1309 | 0 | [334, 106] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_211459__950.json | 0.0 | missing | missing | missing | |
| 349 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231225_181859__204 | 0 | 0.00051 | 2.226 | 0 | [334, 88] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181859__204.json | 0.0 | missing | missing | missing | |
| 350 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231225_181900__676 | 0 | 0.00053 | 1.61234 | 0 | [334, 98] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181900__676.json | 0.0 | missing | missing | missing | |
| 351 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_185202__526 | 0 | 0.000602 | 2.56099 | 0 | [334, 134] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185202__526.json | 0.0 | missing | missing | missing | |
| 352 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_185205__679 | 0 | 0.00059 | 2.92231 | 0 | [334, 128] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185205__679.json | 0.0 | missing | missing | missing | |
| 353 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231215_184758__965 | 0 | 0.0 | 2.6301 | 0 | [334, 97] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_184758__965.json | 0.0 | 0.9 | missing | 0.1 | |
| 354 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_211457__557 | 4 | 0.000419 | 1.13438 | 4 | [333, 43] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_211457__557.json | 100.0 | missing | missing | missing | |
| 355 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_181854__981 | 0 | 0.000491 | 1.84441 | 0 | [333, 79] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_181854__981.json | 50.0 | missing | missing | missing | |
| 356 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_181855__519 | 1 | 0.000415 | 1.16994 | 1 | [333, 41] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_181855__519.json | 62.5 | missing | missing | missing | |
| 357 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_185158__401 | 1 | 0.000449 | 1.3813 | 1 | [333, 58] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_185158__401.json | 62.5 | missing | missing | missing | |
| 358 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_185200__662 | 1 | 0.000425 | 1.07428 | 1 | [333, 46] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_185200__662.json | 62.5 | missing | missing | missing | |
| 359 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 4 | 20231215_184755__379 | 4 | 0.0 | 1.93017 | 4 | [333, 43] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_184755__379.json | 100.0 | 0.9 | missing | 0.1 | |
| 360 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 4 | 20240201_063725__502 | 0 | 0.0153 | 36.2399 | 0 | [75, 485] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_063725__502.json | 50.0 | missing | missing | missing | |
| 361 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 4 | 20240201_063811__651 | 4 | 0.01674 | 45.5416 | 2 | [75, 533] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_063811__651.json | 87.5 | missing | missing | missing | |
| 362 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 4 | 20240201_063847__360 | 1 | 0.01338 | 36.4069 | 4 | [75, 421] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_063847__360.json | 81.25 | missing | missing | missing | |
| 363 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | InJulia | 1SHOT | true | false | 4 | 20240201_063922__903 | 0 | 0.01353 | 34.3451 | 0 | [75, 426] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_063922__903.json | 25.0 | missing | missing | missing | |
| 364 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 4 | 20240201_064040__679 | 0 | 0.01884 | 78.4609 | 0 | [75, 603] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_064040__679.json | 50.0 | missing | missing | missing | |
| 365 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_063255__325 | 4 | 0.00242 | 4.05859 | 4 | [110, 44] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_063255__325.json | 100.0 | missing | missing | missing | |
| 366 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_063259__941 | 4 | 0.00242 | 3.99829 | 4 | [110, 44] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_063259__941.json | 100.0 | missing | missing | missing | |
| 367 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_063303__227 | 4 | 0.00236 | 3.78836 | 4 | [110, 42] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_063303__227.json | 100.0 | missing | missing | missing | |
| 368 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_063307__488 | 4 | 0.00236 | 3.78867 | 4 | [110, 42] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_063307__488.json | 100.0 | missing | missing | missing | |
| 369 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20240201_063313__782 | 1 | 0.00332 | 5.99143 | 1 | [110, 74] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_063313__782.json | 62.5 | missing | missing | missing | |
| 370 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_063039__456 | 4 | 0.00985 | 24.4551 | 4 | [211, 258] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_063039__456.json | 100.0 | missing | missing | missing | |
| 371 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_063055__797 | 4 | 0.00757 | 15.8627 | 4 | [211, 182] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_063055__797.json | 100.0 | missing | missing | missing | |
| 372 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_063127__544 | 4 | 0.01243 | 32.5677 | 4 | [211, 344] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_063127__544.json | 100.0 | missing | missing | missing | |
| 373 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_063207__187 | 4 | 0.01492 | 39.4594 | 4 | [211, 427] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_063207__187.json | 100.0 | missing | missing | missing | |
| 374 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20240201_063229__526 | 4 | 0.0088 | 21.7173 | 4 | [211, 223] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_063229__526.json | 100.0 | missing | missing | missing | |
| 375 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20240201_065114__456 | 0 | 0.01843 | 39.3639 | 0 | [334, 503] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_065114__456.json | 25.0 | missing | missing | missing | |
| 376 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240201_065224__277 | 0 | 0.02023 | 70.3607 | 0 | [334, 563] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_065224__277.json | 50.0 | missing | missing | missing | |
| 377 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240201_065248__264 | 4 | 0.01315 | 24.0883 | 2 | [334, 327] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_065248__264.json | 87.5 | missing | missing | missing | |
| 378 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240201_065316__161 | 0 | 0.01837 | 27.467 | 0 | [334, 501] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_065316__161.json | 50.0 | missing | missing | missing | |
| 379 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20240201_065407__532 | 1 | 0.0208 | 51.1238 | 4 | [334, 582] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_065407__532.json | 81.25 | missing | missing | missing | |
| 380 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | false | 4 | 20240201_064455__245 | 0 | 0.01785 | 56.5654 | 0 | [333, 484] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_064455__245.json | 25.0 | missing | missing | missing | |
| 381 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 4 | 20240201_064534__215 | 4 | 0.01599 | 39.0363 | 4 | [333, 422] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_064534__215.json | 100.0 | missing | missing | missing | |
| 382 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 4 | 20240201_064612__813 | 0 | 0.01551 | 37.5222 | 0 | [333, 406] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_064612__813.json | 50.0 | missing | missing | missing | |
| 383 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 4 | 20240201_064645__355 | 4 | 0.01806 | 33.0739 | 4 | [333, 491] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_064645__355.json | 100.0 | missing | missing | missing | |
| 384 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 4 | 20240201_064716__670 | 1 | 0.01491 | 31.188 | 4 | [333, 386] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_064716__670.json | 81.25 | missing | missing | missing | |
| 385 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 4 | 20231213_211657__775 | 0 | 0.01311 | 35.2411 | 0 | [72, 413] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_211657__775.json | 0.0 | missing | missing | missing | |
| 386 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 4 | 20231225_182107__592 | 0 | 0.01164 | 22.2403 | 0 | [72, 364] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_182107__592.json | 0.0 | missing | missing | missing | |
| 387 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 4 | 20231225_182140__122 | 0 | 0.01392 | 32.7577 | 0 | [72, 440] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_182140__122.json | 0.0 | missing | missing | missing | |
| 388 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 4 | 20231215_185011__745 | 0 | 0.0 | 35.4972 | 0 | [72, 305] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_185011__745.json | 0.0 | 0.1 | missing | 0.9 | |
| 389 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 4 | 20231213_211622__563 | 0 | 0.01023 | 43.0224 | 0 | [75, 316] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_211622__563.json | 50.0 | missing | missing | missing | |
| 390 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 4 | 20231225_182015__644 | 0 | 0.01509 | 23.9905 | 0 | [75, 478] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_182015__644.json | 50.0 | missing | missing | missing | |
| 391 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 4 | 20231225_182045__837 | 0 | 0.01284 | 29.4192 | 0 | [75, 403] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_182045__837.json | 50.0 | missing | missing | missing | |
| 392 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | InJulia | 1SHOT | true | false | 4 | 20231227_185405__113 | 0 | 0.01689 | 30.4644 | 0 | [75, 538] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_185405__113.json | 25.0 | missing | missing | missing | |
| 393 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 4 | 20231227_185431__989 | 0 | 0.01182 | 25.9859 | 0 | [75, 369] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_185431__989.json | 50.0 | missing | missing | missing | |
| 394 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 4 | 20231215_184935__423 | 0 | 0.0 | 56.5906 | 0 | [75, 477] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_184935__423.json | 50.0 | 0.1 | missing | 0.9 | |
| 395 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_211538__347 | 1 | 0.00449 | 9.11277 | 3 | [110, 113] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_211538__347.json | 75.0 | missing | missing | missing | |
| 396 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231225_181944__237 | 4 | 0.00626 | 7.41735 | 4 | [110, 172] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_181944__237.json | 100.0 | missing | missing | missing | |
| 397 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231225_181951__679 | 4 | 0.00527 | 6.84649 | 4 | [110, 139] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_181951__679.json | 100.0 | missing | missing | missing | |
| 398 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185324__676 | 4 | 0.00524 | 12.4056 | 4 | [110, 138] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_185324__676.json | 100.0 | missing | missing | missing | |
| 399 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185335__909 | 4 | 0.00689 | 10.7934 | 4 | [110, 193] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_185335__909.json | 100.0 | missing | missing | missing | |
| 400 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231215_184838__938 | 4 | 0.0 | 12.3766 | 4 | [110, 123] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_184838__938.json | 100.0 | 0.1 | missing | 0.9 | |
| 401 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231213_211529__534 | 0 | 0.01426 | 29.5579 | 0 | [211, 405] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_211529__534.json | 25.0 | missing | missing | missing | |
| 402 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231225_181926__155 | 4 | 0.01642 | 25.7756 | 4 | [211, 477] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181926__155.json | 100.0 | missing | missing | missing | |
| 403 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231225_181936__698 | 4 | 0.00793 | 9.81011 | 4 | [211, 194] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181936__698.json | 100.0 | missing | missing | missing | |
| 404 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_185243__694 | 4 | 0.01045 | 37.3805 | 4 | [211, 278] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185243__694.json | 100.0 | missing | missing | missing | |
| 405 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_185311__308 | 4 | 0.01717 | 28.3739 | 4 | [211, 502] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185311__308.json | 100.0 | missing | missing | missing | |
| 406 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231215_184826__583 | 4 | 0.0 | 28.0287 | 4 | [211, 385] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_184826__583.json | 100.0 | 0.1 | missing | 0.9 | |
| 407 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_211815__586 | 0 | 0.01609 | 35.2746 | 0 | [334, 425] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_211815__586.json | 50.0 | missing | missing | missing | |
| 408 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231225_182254__460 | 0 | 0.01513 | 30.052 | 0 | [334, 393] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182254__460.json | 50.0 | missing | missing | missing | |
| 409 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231225_182332__216 | 0 | 0.01867 | 37.7199 | 0 | [334, 511] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182332__216.json | 50.0 | missing | missing | missing | |
| 410 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_185637__713 | 1 | 0.01657 | 41.3708 | 4 | [334, 441] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185637__713.json | 81.25 | missing | missing | missing | |
| 411 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_185657__723 | 4 | 0.01195 | 19.1839 | 4 | [334, 287] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185657__723.json | 100.0 | missing | missing | missing | |
| 412 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231215_185121__281 | 0 | 0.0 | 27.6373 | 0 | [334, 359] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_185121__281.json | 50.0 | 0.1 | missing | 0.9 | |
| 413 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_211739__415 | 0 | 0.01908 | 42.2722 | 0 | [333, 525] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_211739__415.json | 50.0 | missing | missing | missing | |
| 414 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_182209__915 | 4 | 0.01476 | 29.5884 | 4 | [333, 381] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_182209__915.json | 100.0 | missing | missing | missing | |
| 415 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_182224__691 | 1 | 0.01563 | 14.4305 | 4 | [333, 410] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_182224__691.json | 81.25 | missing | missing | missing | |
| 416 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_185513__375 | 1 | 0.01275 | 41.4783 | 4 | [333, 314] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_185513__375.json | 81.25 | missing | missing | missing | |
| 417 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_185556__641 | 0 | 0.01785 | 42.3043 | 0 | [333, 484] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_185556__641.json | 50.0 | missing | missing | missing | |
| 418 | Apple-MacBook-Pro-M1 | add_yearmonth | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 4 | 20231215_185053__574 | 0 | 0.0 | 42.6558 | 0 | [333, 434] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_185053__574.json | 50.0 | 0.1 | missing | 0.9 | |
| 419 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | AsIs | 1SHOT | false | false | 4 | 20231213_230236__725 | 0 | 0.0 | 13.6328 | 0 | [66, 402] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__AsIs__1SHOT__20231213_230236__725.json | 0.0 | missing | missing | missing | |
| 420 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | AsIs | 1SHOT | false | false | 4 | 20231224_214048__273 | 0 | 0.0 | 12.7355 | 0 | [66, 379] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__AsIs__1SHOT__20231224_214048__273.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 421 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | AsIs | 1SHOT | false | false | 4 | 20231224_214103__187 | 0 | 0.0 | 15.6183 | 0 | [1, 475] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__AsIs__1SHOT__20231224_214103__187.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 422 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | AsIs | 1SHOT | false | false | 4 | 20231226_203525__348 | 0 | 0.0 | 8.97974 | 0 | [66, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__AsIs__1SHOT__20231226_203525__348.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 423 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | AsIs | 1SHOT | false | false | 4 | 20231226_203539__295 | 0 | 0.0 | 14.2925 | 0 | [1, 440] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__AsIs__1SHOT__20231226_203539__295.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 424 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | true | 4 | 20231224_214015__496 | 0 | 0.0 | 7.84819 | 0 | [82, 232] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231224_214015__496.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 425 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | true | 4 | 20231224_214035__183 | 0 | 0.0 | 19.7087 | 0 | [1, 589] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231224_214035__183.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 426 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | true | 4 | 20231226_203505__279 | 0 | 0.0 | 17.0546 | 0 | [82, 502] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231226_203505__279.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 427 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | true | 4 | 20231226_203516__281 | 0 | 0.0 | 10.5221 | 0 | [1, 324] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231226_203516__281.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 428 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | true | 4 | 20231226_204455__223 | 0 | 0.0 | 32.6698 | 0 | [82, 909] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231226_204455__223.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 429 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_213957__497 | 0 | 0.0 | 9.37755 | 0 | [112, 267] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_213957__497.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 430 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231224_214007__226 | 0 | 0.0 | 8.22632 | 0 | [1, 253] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_214007__226.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 431 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_203439__916 | 0 | 0.0 | 7.46196 | 0 | [112, 209] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_203439__916.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 432 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_203448__728 | 0 | 0.0 | 9.34418 | 0 | [1, 289] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_203448__728.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 433 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_204422__111 | 0 | 0.0 | 6.61563 | 0 | [112, 181] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_204422__111.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 434 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_213929__413 | 0 | 0.0 | 22.6526 | 0 | [257, 348] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_213929__413.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 435 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231224_213947__378 | 0 | 0.0 | 17.3161 | 0 | [1, 493] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_213947__378.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 436 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231226_203420__902 | 0 | 0.0 | 20.0184 | 0 | [257, 366] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_203420__902.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 437 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_203431__638 | 0 | 0.0 | 11.5576 | 0 | [1, 339] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_203431__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 438 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_204414__832 | 0 | 0.0 | 23.9462 | 0 | [257, 395] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_204414__832.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 439 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_230308__132 | 0 | 0.0 | 14.0784 | 0 | [11, 382] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230308__132.json | 50.0 | missing | missing | missing | |
| 440 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_214151__944 | 0 | 0.0 | 16.1789 | 0 | [11, 442] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214151__944.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 441 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214220__279 | 0 | 0.0 | 29.3745 | 0 | [1, 774] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214220__279.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 442 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_203647__528 | 0 | 0.0 | 19.107 | 0 | [11, 519] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_203647__528.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 443 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_204536__900 | 0 | 0.0 | 18.173 | 0 | [11, 497] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_204536__900.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 444 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | true | false | 4 | 20231224_214121__105 | 0 | 0.0 | 17.319 | 0 | [383, 387] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_214121__105.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 445 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_214135__489 | 0 | 0.0 | 13.7519 | 0 | [1, 385] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_214135__489.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 446 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_203603__890 | 0 | 0.0 | 23.746 | 0 | [383, 554] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_203603__890.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 447 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_203627__976 | 0 | 0.0 | 24.1853 | 0 | [1, 651] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_203627__976.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 448 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_204517__690 | 0 | 0.0 | 22.2887 | 0 | [383, 513] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_204517__690.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 449 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | AsIs | 1SHOT | false | false | 4 | 20231213_231138__209 | 0 | 0.0 | 12.0618 | 0 | [66, 359] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__AsIs__1SHOT__20231213_231138__209.json | 0.0 | missing | missing | missing | |
| 450 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | AsIs | 1SHOT | false | false | 4 | 20231224_220236__125 | 0 | 0.0 | 6.66944 | 0 | [80, 214] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__AsIs__1SHOT__20231224_220236__125.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 451 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | AsIs | 1SHOT | false | false | 4 | 20231224_220242__955 | 0 | 0.0 | 6.46164 | 0 | [80, 207] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__AsIs__1SHOT__20231224_220242__955.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 452 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | InJulia | 1SHOT | false | false | 4 | 20231213_231126__920 | 0 | 0.0 | 14.8593 | 0 | [82, 437] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__InJulia__1SHOT__20231213_231126__920.json | 0.0 | missing | missing | missing | |
| 453 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | InJulia | 1SHOT | true | true | 4 | 20231224_220223__782 | 0 | 0.0 | 11.0284 | 0 | [82, 360] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__InJulia__1SHOT__20231224_220223__782.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 454 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | InJulia | 1SHOT | true | true | 4 | 20231224_220229__218 | 0 | 0.0 | 5.76961 | 0 | [82, 183] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__InJulia__1SHOT__20231224_220229__218.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 455 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | InJulia | 1SHOT | true | true | 4 | 20231226_205457__290 | 0 | 0.0 | 6.93698 | 0 | [82, 224] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__InJulia__1SHOT__20231226_205457__290.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 456 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_231111__168 | 0 | 0.0 | 9.17501 | 0 | [112, 260] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231213_231111__168.json | 50.0 | missing | missing | missing | |
| 457 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_220207__671 | 0 | 0.0 | 7.37878 | 0 | [122, 232] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231224_220207__671.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 458 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_220212__569 | 0 | 0.0 | 4.6755 | 0 | [122, 141] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231224_220212__569.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 459 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231226_205450__204 | 0 | 0.0 | 8.69468 | 0 | [122, 277] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_205450__204.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 460 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231213_231102__246 | 0 | 0.0 | 21.7643 | 0 | [239, 563] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231102__246.json | 25.0 | missing | missing | missing | |
| 461 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231224_220149__428 | 0 | 0.0 | 16.5262 | 0 | [249, 313] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220149__428.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 462 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_220159__582 | 0 | 0.0 | 10.3947 | 0 | [249, 308] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220159__582.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 463 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_205441__853 | 0 | 0.0 | 15.2213 | 0 | [249, 277] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205441__853.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 464 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_231216__154 | 0 | 0.0 | 20.0306 | 0 | [11, 540] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231216__154.json | 50.0 | missing | missing | missing | |
| 465 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_220311__471 | 0 | 0.0 | 11.0945 | 0 | [386, 302] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220311__471.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 466 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_220324__277 | 0 | 0.0 | 12.1812 | 0 | [386, 336] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220324__277.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 467 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_205513__194 | 0 | 0.0 | 10.079 | 0 | [386, 271] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205513__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 468 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_231156__788 | 0 | 0.0 | 18.1615 | 0 | [383, 410] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapTask__1SHOT__20231213_231156__788.json | 50.0 | missing | missing | missing | |
| 469 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapTask | 1SHOT | true | false | 4 | 20231224_220251__379 | 0 | 0.0 | 8.74037 | 0 | [383, 232] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapTask__1SHOT__20231224_220251__379.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 470 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_220300__977 | 0 | 0.0 | 9.06196 | 0 | [383, 242] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapTask__1SHOT__20231224_220300__977.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 471 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_205503__695 | 0 | 0.0 | 5.89635 | 0 | [383, 141] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_205503__695.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 472 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 4 | 20231227_174325__527 | 0 | 0.0 | 11.9271 | 0 | [82, 223] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174325__527.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 473 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 4 | 20231227_174339__424 | 0 | 0.0 | 14.3941 | 0 | [82, 276] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174339__424.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 474 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 4 | 20231227_174354__564 | 0 | 0.0 | 14.4403 | 0 | [82, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174354__564.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 475 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_174254__941 | 0 | 0.0 | 10.6599 | 0 | [122, 199] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174254__941.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 476 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_174303__589 | 0 | 0.0 | 8.87612 | 0 | [122, 163] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174303__589.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 477 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_174313__328 | 0 | 0.0 | 9.56752 | 0 | [122, 173] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174313__328.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 478 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_174205__934 | 0 | 0.0 | 21.9707 | 0 | [249, 262] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174205__934.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 479 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_174226__547 | 0 | 0.0 | 20.7435 | 0 | [249, 376] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174226__547.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 480 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_174243__107 | 0 | 0.0 | 17.5298 | 0 | [249, 318] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174243__107.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 481 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_174442__730 | 0 | 0.0 | 14.3644 | 0 | [386, 237] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174442__730.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 482 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_174454__806 | 0 | 0.0 | 12.2136 | 0 | [386, 196] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174454__806.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 483 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_174510__987 | 0 | 0.0 | 16.2236 | 0 | [386, 252] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174510__987.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 484 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_174406__571 | 0 | 0.0 | 12.2934 | 0 | [383, 200] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174406__571.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 485 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_174416__969 | 0 | 0.0 | 9.19656 | 0 | [383, 139] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174416__969.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 486 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_174427__514 | 0 | 0.0 | 11.4643 | 0 | [383, 180] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174427__514.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 487 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | AsIs | 1SHOT | false | false | 4 | 20231213_212013__912 | 0 | 0.00398054 | 12.8441 | 0 | [78, 466] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__AsIs__1SHOT__20231213_212013__912.json | 0.0 | missing | missing | missing | |
| 488 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | AsIs | 1SHOT | false | false | 4 | 20231225_182707__987 | 0 | 0.00333334 | 8.59628 | 0 | [78, 386] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__AsIs__1SHOT__20231225_182707__987.json | 0.0 | missing | missing | missing | |
| 489 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | AsIs | 1SHOT | false | false | 4 | 20231225_182720__256 | 0 | 0.00458729 | 12.2366 | 0 | [78, 541] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__AsIs__1SHOT__20231225_182720__256.json | 0.0 | missing | missing | missing | |
| 490 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium--optim | AsIs | 1SHOT | false | false | 4 | 20231215_185518__777 | 0 | 0.0 | 25.3563 | 0 | [78, 256] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__AsIs__1SHOT__20231215_185518__777.json | 0.0 | 0.9 | missing | 0.3 | |
| 491 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | InJulia | 1SHOT | true | true | 4 | 20231213_212000__209 | 0 | 0.00401021 | 14.1316 | 0 | [80, 469] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__InJulia__1SHOT__20231213_212000__209.json | 50.0 | missing | missing | missing | |
| 492 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | InJulia | 1SHOT | true | true | 4 | 20231225_182648__407 | 0 | 0.00277244 | 14.9225 | 0 | [80, 316] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__InJulia__1SHOT__20231225_182648__407.json | 50.0 | missing | missing | missing | |
| 493 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | InJulia | 1SHOT | true | false | 4 | 20231225_182659__394 | 0 | 0.00333065 | 10.6898 | 0 | [80, 385] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__InJulia__1SHOT__20231225_182659__394.json | 25.0 | missing | missing | missing | |
| 494 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | InJulia | 1SHOT | true | true | 4 | 20231227_185941__955 | 0 | 0.0047464 | 12.7109 | 0 | [80, 560] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__InJulia__1SHOT__20231227_185941__955.json | 50.0 | missing | missing | missing | |
| 495 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | InJulia | 1SHOT | true | true | 4 | 20231227_185955__635 | 0 | 0.00526416 | 14.1964 | 0 | [80, 624] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__InJulia__1SHOT__20231227_185955__635.json | 50.0 | missing | missing | missing | |
| 496 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium--optim | InJulia | 1SHOT | true | true | 4 | 20231215_185453__736 | 0 | 0.0 | 48.8452 | 0 | [80, 533] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__InJulia__1SHOT__20231215_185453__736.json | 50.0 | 0.9 | missing | 0.3 | |
| 497 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_211946__682 | 0 | 0.00159413 | 3.55726 | 0 | [120, 157] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_211946__682.json | 50.0 | missing | missing | missing | |
| 498 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231225_182622__443 | 0 | 0.00428001 | 14.2355 | 0 | [120, 489] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_182622__443.json | 25.0 | missing | missing | missing | |
| 499 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231225_182633__627 | 1 | 0.00212807 | 11.0042 | 1 | [120, 223] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_182633__627.json | 62.5 | missing | missing | missing | |
| 500 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185924__792 | 0 | 0.00200672 | 4.77038 | 0 | [120, 208] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_185924__792.json | 50.0 | missing | missing | missing | |
| 501 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231227_185928__482 | 0 | 0.00185301 | 4.34376 | 0 | [120, 189] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_185928__482.json | 25.0 | missing | missing | missing | |
| 502 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231215_185404__384 | 0 | 0.0 | 49.8923 | 0 | [120, 503] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_185404__384.json | 50.0 | 0.9 | missing | 0.3 | |
| 503 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_211943__163 | 0 | 0.00533483 | 13.0198 | 0 | [247, 577] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_211943__163.json | 50.0 | missing | missing | missing | |
| 504 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231225_182554__633 | 1 | 0.00436403 | 10.3735 | 1 | [247, 457] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_182554__633.json | 62.5 | missing | missing | missing | |
| 505 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231225_182607__384 | 1 | 0.00544809 | 13.4233 | 1 | [247, 591] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_182607__384.json | 62.5 | missing | missing | missing | |
| 506 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_185906__307 | 0 | 0.00455819 | 18.0558 | 0 | [247, 481] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185906__307.json | 25.0 | missing | missing | missing | |
| 507 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_185919__246 | 1 | 0.0049546 | 12.2767 | 1 | [247, 530] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185919__246.json | 62.5 | missing | missing | missing | |
| 508 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231215_185314__737 | 1 | 0.0 | 54.4364 | 1 | [247, 558] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_185314__737.json | 62.5 | 0.9 | missing | 0.3 | |
| 509 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231213_212041__910 | 0 | 0.00417302 | 8.84483 | 0 | [383, 388] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_212041__910.json | 25.0 | missing | missing | missing | |
| 510 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231225_182848__140 | 4 | 0.00634114 | 14.9666 | 4 | [383, 656] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182848__140.json | 100.0 | missing | missing | missing | |
| 511 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231225_182913__940 | 0 | 0.0048364 | 24.571 | 0 | [383, 470] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182913__940.json | 25.0 | missing | missing | missing | |
| 512 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_190118__399 | 0 | 0.00716632 | 40.7137 | 0 | [383, 758] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190118__399.json | 50.0 | missing | missing | missing | |
| 513 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_190139__481 | 1 | 0.00561304 | 19.9824 | 4 | [383, 566] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190139__481.json | 81.25 | missing | missing | missing | |
| 514 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231215_185731__768 | 1 | 0.0 | 69.5175 | 1 | [383, 672] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_185731__768.json | 62.5 | 0.9 | missing | 0.3 | |
| 515 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_212032__321 | 1 | 0.00744137 | 18.2389 | 1 | [380, 793] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_212032__321.json | 62.5 | missing | missing | missing | |
| 516 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_182753__192 | 1 | 0.00876813 | 33.6452 | 4 | [380, 957] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_182753__192.json | 81.25 | missing | missing | missing | |
| 517 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_182833__451 | 0 | 0.00689934 | 38.9216 | 0 | [380, 726] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_182833__451.json | 50.0 | missing | missing | missing | |
| 518 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 4 | 20231227_190021__829 | 0 | 0.00661619 | 25.3909 | 0 | [380, 691] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_190021__829.json | 25.0 | missing | missing | missing | |
| 519 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium | JuliaRecapTask | 1SHOT | false | false | 4 | 20231227_190038__274 | 0 | 0.0068508 | 16.7781 | 0 | [380, 720] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_190038__274.json | 0.0 | missing | missing | missing | |
| 520 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | false | 4 | 20231215_185621__852 | 0 | 0.0 | 62.9579 | 0 | [380, 630] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_185621__852.json | 25.0 | 0.9 | missing | 0.3 | |
| 521 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | AsIs | 1SHOT | false | false | 4 | 20231213_211915__140 | 0 | 0.000723646 | 4.94191 | 0 | [78, 347] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__AsIs__1SHOT__20231213_211915__140.json | 0.0 | missing | missing | missing | |
| 522 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | AsIs | 1SHOT | false | false | 4 | 20231225_182508__518 | 0 | 0.000558746 | 3.65051 | 0 | [78, 262] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__AsIs__1SHOT__20231225_182508__518.json | 0.0 | missing | missing | missing | |
| 523 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | AsIs | 1SHOT | false | false | 4 | 20231225_182510__878 | 0 | 0.000337586 | 2.28613 | 0 | [78, 148] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__AsIs__1SHOT__20231225_182510__878.json | 0.0 | missing | missing | missing | |
| 524 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small--optim | AsIs | 1SHOT | false | false | 4 | 20231215_185203__466 | 0 | 0.0 | 6.00841 | 0 | [78, 456] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__AsIs__1SHOT__20231215_185203__466.json | 0.0 | 0.9 | missing | 0.3 | |
| 525 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | InJulia | 1SHOT | true | true | 4 | 20231213_211910__411 | 0 | 0.00038738 | 2.3926 | 0 | [80, 173] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__InJulia__1SHOT__20231213_211910__411.json | 50.0 | missing | missing | missing | |
| 526 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | InJulia | 1SHOT | true | true | 4 | 20231225_182456__470 | 0 | 0.00078896 | 6.5607 | 0 | [80, 380] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__InJulia__1SHOT__20231225_182456__470.json | 50.0 | missing | missing | missing | |
| 527 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | InJulia | 1SHOT | true | true | 4 | 20231225_182504__140 | 0 | 0.00104698 | 7.50162 | 0 | [80, 513] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__InJulia__1SHOT__20231225_182504__140.json | 50.0 | missing | missing | missing | |
| 528 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | InJulia | 1SHOT | true | true | 4 | 20231227_185802__345 | 0 | 0.00079478 | 5.18798 | 0 | [80, 383] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__InJulia__1SHOT__20231227_185802__345.json | 50.0 | missing | missing | missing | |
| 529 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | InJulia | 1SHOT | true | true | 4 | 20231227_185810__769 | 4 | 0.00121964 | 8.10022 | 4 | [80, 602] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__InJulia__1SHOT__20231227_185810__769.json | 100.0 | missing | missing | missing | |
| 530 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small--optim | InJulia | 1SHOT | true | true | 4 | 20231215_185157__114 | 1 | 0.0 | 2.28522 | 4 | [80, 167] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__InJulia__1SHOT__20231215_185157__114.json | 81.25 | 0.9 | missing | 0.3 | |
| 531 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_211907__584 | 0 | 0.000905374 | 5.84142 | 0 | [122, 426] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_211907__584.json | 50.0 | missing | missing | missing | |
| 532 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231225_182446__848 | 0 | 0.000410674 | 2.912 | 0 | [122, 171] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_182446__848.json | 50.0 | missing | missing | missing | |
| 533 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231225_182450__853 | 0 | 0.000546474 | 4.1522 | 0 | [122, 241] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_182450__853.json | 25.0 | missing | missing | missing | |
| 534 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_185754__619 | 0 | 0.000336954 | 1.93832 | 0 | [122, 133] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_185754__619.json | 0.0 | missing | missing | missing | |
| 535 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185757__894 | 0 | 0.000445594 | 2.6964 | 0 | [122, 189] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_185757__894.json | 50.0 | missing | missing | missing | |
| 536 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231215_185155__726 | 0 | 0.0 | 2.53541 | 0 | [122, 182] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_185155__726.json | 25.0 | 0.9 | missing | 0.3 | |
| 537 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_211901__476 | 4 | 0.000785783 | 4.46199 | 4 | [249, 322] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_211901__476.json | 100.0 | missing | missing | missing | |
| 538 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231225_182433__781 | 0 | 0.000289143 | 1.09273 | 0 | [249, 66] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_182433__781.json | 0.0 | missing | missing | missing | |
| 539 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231225_182443__979 | 1 | 0.00149582 | 9.46059 | 1 | [249, 688] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_182443__979.json | 62.5 | missing | missing | missing | |
| 540 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_185745__664 | 4 | 0.000533583 | 2.76923 | 4 | [249, 192] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185745__664.json | 100.0 | missing | missing | missing | |
| 541 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_185752__474 | 0 | 0.00106902 | 6.36944 | 0 | [249, 468] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185752__474.json | 50.0 | missing | missing | missing | |
| 542 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231215_185152__385 | 4 | 0.0 | 2.55497 | 4 | [249, 182] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_185152__385.json | 100.0 | 0.9 | missing | 0.3 | |
| 543 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_211929__988 | 0 | 0.00116284 | 6.49723 | 0 | [388, 470] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_211929__988.json | 50.0 | missing | missing | missing | |
| 544 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231225_182533__532 | 1 | 0.00128506 | 9.09675 | 4 | [388, 533] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182533__532.json | 81.25 | missing | missing | missing | |
| 545 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231225_182543__859 | 1 | 0.00160322 | 9.95021 | 4 | [388, 697] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182543__859.json | 81.25 | missing | missing | missing | |
| 546 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_185840__684 | 1 | 0.0011706 | 14.1279 | 1 | [388, 474] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185840__684.json | 62.5 | missing | missing | missing | |
| 547 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_185848__682 | 1 | 0.0012191 | 7.93456 | 4 | [388, 499] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185848__682.json | 81.25 | missing | missing | missing | |
| 548 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231215_185219__824 | 1 | 0.0 | 8.89169 | 4 | [388, 661] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_185219__824.json | 81.25 | 0.9 | missing | 0.3 | |
| 549 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_211923__707 | 1 | 0.00126242 | 7.23486 | 4 | [386, 522] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_211923__707.json | 81.25 | missing | missing | missing | |
| 550 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_182517__784 | 4 | 0.00120228 | 6.73199 | 4 | [386, 491] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_182517__784.json | 100.0 | missing | missing | missing | |
| 551 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_182523__686 | 0 | 0.0011499 | 6.31996 | 0 | [386, 464] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_182523__686.json | 50.0 | missing | missing | missing | |
| 552 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_185819__831 | 4 | 0.00141374 | 8.33438 | 4 | [386, 600] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_185819__831.json | 100.0 | missing | missing | missing | |
| 553 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_185826__516 | 1 | 0.00116542 | 6.73793 | 4 | [386, 472] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_185826__516.json | 81.25 | missing | missing | missing | |
| 554 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-small--optim | JuliaRecapTask | 1SHOT | true | true | 4 | 20231215_185210__709 | 1 | 0.0 | 7.31005 | 4 | [386, 541] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_185210__709.json | 81.25 | 0.9 | missing | 0.3 | |
| 555 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | AsIs | 1SHOT | false | false | 4 | 20231213_211836__740 | 0 | 0.000167205 | 4.9598 | 0 | [78, 345] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__AsIs__1SHOT__20231213_211836__740.json | 0.0 | missing | missing | missing | |
| 556 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | AsIs | 1SHOT | false | false | 4 | 20231225_182406__850 | 0 | 0.000189402 | 3.68593 | 0 | [78, 394] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__AsIs__1SHOT__20231225_182406__850.json | 0.0 | missing | missing | missing | |
| 557 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | AsIs | 1SHOT | false | false | 4 | 20231225_182413__581 | 0 | 0.000112845 | 6.94135 | 0 | [78, 225] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__AsIs__1SHOT__20231225_182413__581.json | 0.0 | missing | missing | missing | |
| 558 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny--optim | AsIs | 1SHOT | false | false | 4 | 20231215_185140__671 | 0 | 0.0 | 4.69401 | 0 | [78, 342] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__AsIs__1SHOT__20231215_185140__671.json | 0.0 | 0.9 | missing | 0.3 | |
| 559 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | InJulia | 1SHOT | true | true | 4 | 20231213_211831__415 | 0 | 0.000240871 | 6.20523 | 0 | [80, 507] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__InJulia__1SHOT__20231213_211831__415.json | 50.0 | missing | missing | missing | |
| 560 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | InJulia | 1SHOT | false | false | 4 | 20231225_182359__535 | 0 | 0.00016069 | 3.04137 | 0 | [80, 330] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__InJulia__1SHOT__20231225_182359__535.json | 0.0 | missing | missing | missing | |
| 561 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | InJulia | 1SHOT | false | false | 4 | 20231225_182402__539 | 0 | 0.000166579 | 3.17015 | 0 | [80, 343] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__InJulia__1SHOT__20231225_182402__539.json | 0.0 | missing | missing | missing | |
| 562 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | InJulia | 1SHOT | true | true | 4 | 20231227_185720__596 | 0 | 0.000132604 | 2.39782 | 0 | [80, 268] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__InJulia__1SHOT__20231227_185720__596.json | 50.0 | missing | missing | missing | |
| 563 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | InJulia | 1SHOT | true | true | 4 | 20231227_185724__815 | 0 | 0.000182887 | 3.36856 | 0 | [80, 379] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__InJulia__1SHOT__20231227_185724__815.json | 50.0 | missing | missing | missing | |
| 564 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny--optim | InJulia | 1SHOT | false | false | 4 | 20231215_185135__484 | 0 | 0.0 | 5.18198 | 0 | [80, 627] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__InJulia__1SHOT__20231215_185135__484.json | 0.0 | 0.9 | missing | 0.3 | |
| 565 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231213_211825__351 | 0 | 0.00010768 | 2.30395 | 0 | [122, 200] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_211825__351.json | 0.0 | missing | missing | missing | |
| 566 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231225_182354__663 | 0 | 9.2731e-5 | 1.64232 | 0 | [122, 167] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_182354__663.json | 50.0 | missing | missing | missing | |
| 567 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231225_182356__847 | 0 | 9.6808e-5 | 1.73592 | 0 | [122, 176] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_182356__847.json | 0.0 | missing | missing | missing | |
| 568 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185715__238 | 0 | 8.0047e-5 | 1.37226 | 0 | [122, 139] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_185715__238.json | 50.0 | missing | missing | missing | |
| 569 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_185717__504 | 1 | 9.4996e-5 | 1.61793 | 4 | [122, 172] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_185717__504.json | 81.25 | missing | missing | missing | |
| 570 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231215_185130__208 | 0 | 0.0 | 1.32679 | 0 | [122, 152] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_185130__208.json | 0.0 | 0.9 | missing | 0.3 | |
| 571 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_211823__823 | 0 | 0.000220137 | 7.76777 | 0 | [249, 409] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_211823__823.json | 50.0 | missing | missing | missing | |
| 572 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231225_182345__966 | 0 | 0.000165324 | 12.9214 | 0 | [249, 288] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_182345__966.json | 25.0 | missing | missing | missing | |
| 573 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231225_182352__233 | 0 | 0.000346524 | 6.40634 | 0 | [249, 688] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_182352__233.json | 50.0 | missing | missing | missing | |
| 574 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_185705__421 | 0 | 0.000141315 | 8.26577 | 0 | [249, 235] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185705__421.json | 50.0 | missing | missing | missing | |
| 575 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231227_185714__907 | 0 | 0.000357396 | 8.86204 | 0 | [249, 712] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_185714__907.json | 0.0 | missing | missing | missing | |
| 576 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231215_185128__378 | 0 | 0.0 | 7.10832 | 0 | [249, 576] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_185128__378.json | 50.0 | 0.9 | missing | 0.3 | |
| 577 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_211857__510 | 0 | 0.000322496 | 11.8551 | 0 | [388, 592] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_211857__510.json | 50.0 | missing | missing | missing | |
| 578 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231225_182426__721 | 0 | 0.000265418 | 4.14092 | 0 | [388, 466] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182426__721.json | 0.0 | missing | missing | missing | |
| 579 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231225_182432__773 | 0 | 0.000349676 | 5.69536 | 0 | [388, 652] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182432__773.json | 50.0 | missing | missing | missing | |
| 580 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_185738__544 | 0 | 0.000269495 | 4.23682 | 0 | [388, 475] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185738__544.json | 0.0 | missing | missing | missing | |
| 581 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_185743__180 | 0 | 0.000280367 | 4.50207 | 0 | [388, 499] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_185743__180.json | 0.0 | missing | missing | missing | |
| 582 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231215_185149__671 | 0 | 0.0 | 4.8533 | 0 | [388, 562] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_185149__671.json | 50.0 | 0.9 | missing | 0.3 | |
| 583 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 4 | 20231213_211845__155 | 0 | 0.000343507 | 8.37571 | 0 | [386, 639] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_211845__155.json | 0.0 | missing | missing | missing | |
| 584 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_182417__578 | 1 | 0.000252001 | 3.89883 | 1 | [386, 437] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_182417__578.json | 62.5 | missing | missing | missing | |
| 585 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 4 | 20231225_182422__950 | 0 | 0.000296848 | 4.7298 | 0 | [386, 536] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_182422__950.json | 50.0 | missing | missing | missing | |
| 586 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_185726__601 | 0 | 0.000182239 | 2.59637 | 0 | [386, 283] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_185726__601.json | 50.0 | missing | missing | missing | |
| 587 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_185733__597 | 1 | 0.000380653 | 6.51361 | 1 | [386, 721] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_185733__597.json | 62.5 | missing | missing | missing | |
| 588 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | true | 4 | 20231215_185144__687 | 1 | 0.0 | 3.95189 | 4 | [386, 463] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_185144__687.json | 81.25 | 0.9 | missing | 0.3 | |
| 660 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_223116__630 | 0 | 0.0 | 10.0863 | 0 | [387, 200] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223116__630.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 661 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_223148__194 | 0 | 0.0 | 31.7965 | 0 | [387, 724] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223148__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 662 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_223213__318 | 0 | 0.0 | 24.8027 | 0 | [387, 559] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223213__318.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 663 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_223239__952 | 0 | 0.0 | 25.5192 | 0 | [387, 576] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223239__952.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 664 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231227_222852__562 | 0 | 0.0 | 29.1068 | 0 | [385, 661] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_222852__562.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 665 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_222918__108 | 0 | 0.0 | 26.2944 | 0 | [385, 595] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_222918__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 666 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_222945__681 | 1 | 0.0 | 26.8597 | 2 | [385, 608] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_222945__681.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 667 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_223017__128 | 0 | 0.0 | 31.4353 | 1 | [385, 716] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_223017__128.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 668 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231227_223042__330 | 0 | 0.0 | 24.3946 | 0 | [385, 549] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_223042__330.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 669 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 4 | 20231226_115434__742 | 0 | 0.0 | 22.4286 | 0 | [75, 410] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_115434__742.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 670 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 4 | 20231226_115500__655 | 0 | 0.0 | 26.2267 | 0 | [75, 480] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_115500__655.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 671 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 4 | 20231226_115349__339 | 0 | 0.0 | 32.2254 | 0 | [78, 584] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_115349__339.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 672 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 4 | 20231226_115411__684 | 0 | 0.0 | 21.9377 | 0 | [78, 401] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_115411__684.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 673 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 4 | 20231226_211310__750 | 0 | 0.0 | 24.3564 | 0 | [78, 448] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_211310__750.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 674 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_115305__779 | 0 | 0.0 | 4.33453 | 0 | [120, 67] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_115305__779.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 675 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_115316__941 | 0 | 0.0 | 11.313 | 0 | [120, 196] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_115316__941.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 676 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_211246__297 | 0 | 0.0 | 17.5896 | 0 | [120, 318] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_211246__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 677 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_115225__918 | 0 | 0.0 | 34.5511 | 0 | [247, 392] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_115225__918.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 678 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_115300__231 | 0 | 0.0 | 32.3957 | 3 | [247, 557] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_115300__231.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 679 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_211228__845 | 0 | 0.0 | 28.7122 | 0 | [247, 320] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211228__845.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 680 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_115621__498 | 1 | 0.0 | 23.6615 | 3 | [387, 390] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_115621__498.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 681 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_115655__171 | 0 | 0.0 | 33.4395 | 0 | [387, 561] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_115655__171.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 682 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_211434__485 | 0 | 0.0 | 47.8741 | 0 | [387, 821] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211434__485.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 683 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_115521__407 | 0 | 0.0 | 21.3724 | 0 | [385, 349] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_115521__407.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 684 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_115557__452 | 0 | 0.0 | 34.8083 | 0 | [385, 589] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_115557__452.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 685 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_211346__544 | 0 | 0.0 | 35.343 | 0 | [385, 602] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_211346__544.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 686 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20231227_100331__531 | 0 | 0.0 | 43.6627 | 0 | [83, 237] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_100331__531.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 687 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231227_100413__256 | 1 | 0.0 | 42.1394 | 3 | [83, 246] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_100413__256.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 688 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231227_100451__992 | 0 | 0.0 | 37.2126 | 0 | [83, 216] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_100451__992.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 689 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20231227_140857__638 | 0 | 0.0 | 37.0819 | 0 | [83, 214] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_140857__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 690 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231227_140955__313 | 0 | 0.0 | 57.9009 | 0 | [83, 340] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_140955__313.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 691 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_100136__353 | 0 | 0.0 | 36.6525 | 0 | [122, 200] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_100136__353.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 692 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_100204__483 | 1 | 0.0 | 28.5557 | 1 | [122, 149] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_100204__483.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 693 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_100246__675 | 1 | 0.0 | 41.6302 | 1 | [122, 224] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_100246__675.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 694 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_140807__987 | 0 | 0.0 | 32.9359 | 0 | [122, 176] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_140807__987.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 695 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_140820__166 | 0 | 0.0 | 11.7228 | 0 | [122, 53] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_140820__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 696 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_095754__625 | 1 | 0.0 | 67.743 | 1 | [248, 343] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095754__625.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 697 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_095949__337 | 1 | 0.0 | 114.657 | 4 | [248, 645] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095949__337.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 698 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_100059__994 | 1 | 0.0 | 68.9864 | 1 | [248, 369] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_100059__994.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 699 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_140627__899 | 0 | 0.0 | 93.4955 | 0 | [248, 505] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_140627__899.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 700 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_140734__156 | 0 | 0.0 | 66.3465 | 0 | [248, 346] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_140734__156.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 701 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_100858__537 | 0 | 0.0 | 59.9875 | 0 | [396, 296] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_100858__537.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 702 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_101043__252 | 0 | 0.0 | 105.007 | 0 | [396, 557] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101043__252.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 703 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_101207__404 | 0 | 0.0 | 83.6938 | 3 | [396, 434] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101207__404.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 704 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_141410__970 | 0 | 0.0 | 79.4301 | 0 | [396, 407] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_141410__970.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 705 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231227_141523__762 | 0 | 0.0 | 73.4373 | 0 | [396, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_141523__762.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 706 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_100620__397 | 0 | 0.0 | 89.4829 | 0 | [394, 468] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_100620__397.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 707 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_100644__464 | 0 | 0.0 | 23.6161 | 1 | [394, 80] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_100644__464.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 708 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231227_100757__261 | 0 | 0.0 | 73.2466 | 0 | [394, 374] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_100757__261.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 709 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_141057__871 | 0 | 0.0 | 61.9135 | 0 | [394, 305] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_141057__871.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 710 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231227_141250__755 | 0 | 0.0 | 112.798 | 0 | [394, 598] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_141250__755.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 711 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231219_211319__315 | 0 | 0.0 | 12.2669 | 0 | [66, 366] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_211319__315.json | 0.0 | missing | missing | missing | |
| 712 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231219_211334__226 | 0 | 0.0 | 15.4507 | 0 | [1, 473] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_211334__226.json | 0.0 | missing | missing | missing | |
| 713 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231219_211342__188 | 0 | 0.0 | 8.04381 | 0 | [1, 256] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_211342__188.json | 0.0 | missing | missing | missing | |
| 714 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231224_223545__717 | 0 | 0.0 | 9.80986 | 0 | [84, 243] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231224_223545__717.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 715 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231224_223554__453 | 0 | 0.0 | 8.76813 | 0 | [84, 216] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231224_223554__453.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 716 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20231219_211255__302 | 0 | 0.0 | 11.8809 | 0 | [1, 370] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_211255__302.json | 0.0 | missing | missing | missing | |
| 717 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231219_211306__730 | 0 | 0.0 | 11.3203 | 0 | [1, 353] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_211306__730.json | 50.0 | missing | missing | missing | |
| 718 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20231224_223527__363 | 0 | 0.0 | 12.7237 | 0 | [87, 318] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231224_223527__363.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 719 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231224_223535__963 | 0 | 0.0 | 7.69095 | 0 | [87, 188] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231224_223535__963.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 720 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231226_211022__802 | 0 | 0.0 | 15.8072 | 0 | [87, 395] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_211022__802.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 721 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231219_211221__787 | 0 | 0.0 | 6.76287 | 0 | [1, 213] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_211221__787.json | 50.0 | missing | missing | missing | |
| 722 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231219_211228__869 | 0 | 0.0 | 7.37654 | 0 | [1, 232] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_211228__869.json | 50.0 | missing | missing | missing | |
| 723 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_223507__957 | 0 | 0.0 | 2.61661 | 0 | [129, 47] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_223507__957.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 724 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_223515__975 | 0 | 0.0 | 7.6359 | 0 | [129, 178] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_223515__975.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 725 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_211006__345 | 0 | 0.0 | 2.62921 | 0 | [129, 47] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_211006__345.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 726 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231219_211156__109 | 0 | 0.0 | 11.5419 | 0 | [1, 341] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_211156__109.json | 0.0 | missing | missing | missing | |
| 727 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231219_211205__741 | 0 | 0.0 | 9.71335 | 0 | [1, 289] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_211205__741.json | 0.0 | missing | missing | missing | |
| 728 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_223445__779 | 1 | 0.0 | 22.1232 | 4 | [256, 368] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223445__779.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 729 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_223504__136 | 0 | 0.0 | 19.2982 | 0 | [256, 455] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223504__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 730 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231226_211003__425 | 0 | 0.0 | 22.0489 | 0 | [256, 375] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211003__425.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 731 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231219_211525__380 | 0 | 0.0 | 27.843 | 0 | [1, 739] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_211525__380.json | 50.0 | missing | missing | missing | |
| 732 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231219_211541__280 | 0 | 0.0 | 16.5645 | 0 | [1, 459] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_211541__280.json | 50.0 | missing | missing | missing | |
| 733 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_223642__146 | 0 | 0.0 | 18.2173 | 0 | [396, 402] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223642__146.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 734 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231224_223657__825 | 0 | 0.0 | 14.948 | 0 | [396, 321] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223657__825.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 735 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_211105__230 | 1 | 0.0 | 20.2328 | 1 | [396, 449] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211105__230.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 736 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231219_211416__442 | 0 | 0.0 | 14.5984 | 0 | [1, 408] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_211416__442.json | 50.0 | missing | missing | missing | |
| 737 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231219_211438__774 | 0 | 0.0 | 22.1919 | 0 | [1, 602] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_211438__774.json | 50.0 | missing | missing | missing | |
| 738 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_223611__144 | 0 | 0.0 | 16.9902 | 0 | [394, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223611__144.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 739 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_223624__880 | 0 | 0.0 | 12.7834 | 0 | [394, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223624__880.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 740 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_211044__118 | 0 | 0.0 | 22.7122 | 0 | [394, 509] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_211044__118.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 741 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 4 | 20231213_230409__837 | 0 | 0.0 | 15.3214 | 0 | [66, 452] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231213_230409__837.json | 0.0 | missing | missing | missing | |
| 742 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 4 | 20231224_214337__293 | 0 | 0.0 | 13.6135 | 0 | [83, 437] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231224_214337__293.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 743 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 4 | 20231224_214351__816 | 0 | 0.0 | 13.6648 | 0 | [83, 438] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231224_214351__816.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 744 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 4 | 20231213_230354__160 | 0 | 0.0 | 17.6194 | 0 | [82, 516] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231213_230354__160.json | 0.0 | missing | missing | missing | |
| 745 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 4 | 20231224_214319__409 | 0 | 0.0 | 6.57793 | 0 | [85, 207] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_214319__409.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 746 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 4 | 20231224_214323__877 | 0 | 0.0 | 4.52794 | 0 | [85, 138] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_214323__877.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 747 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 4 | 20231226_204611__837 | 0 | 0.0 | 7.6741 | 0 | [85, 242] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_204611__837.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 748 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231213_230336__999 | 0 | 0.0 | 7.70955 | 0 | [112, 217] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231213_230336__999.json | 25.0 | missing | missing | missing | |
| 749 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214309__839 | 0 | 0.0 | 2.98036 | 0 | [127, 80] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_214309__839.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 750 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214312__147 | 0 | 0.0 | 3.011 | 0 | [127, 82] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_214312__147.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 751 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_204604__802 | 0 | 0.0 | 6.27106 | 0 | [127, 190] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_204604__802.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 752 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_230329__973 | 0 | 0.0 | 20.1513 | 0 | [239, 520] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230329__973.json | 50.0 | missing | missing | missing | |
| 753 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_214246__900 | 0 | 0.0 | 25.225 | 0 | [254, 599] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214246__900.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 754 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_214306__896 | 1 | 0.0 | 19.6735 | 1 | [254, 591] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214306__896.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 755 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_204557__171 | 1 | 0.0 | 21.1393 | 1 | [254, 467] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_204557__171.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 756 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_230444__378 | 0 | 0.0 | 15.1027 | 0 | [11, 410] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230444__378.json | 50.0 | missing | missing | missing | |
| 757 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214422__875 | 1 | 0.0 | 13.5487 | 1 | [394, 373] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214422__875.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 758 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214441__872 | 1 | 0.0 | 18.7365 | 1 | [394, 531] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214441__872.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 759 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_204649__756 | 0 | 0.0 | 22.1708 | 0 | [394, 632] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_204649__756.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 760 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_230429__150 | 0 | 0.0 | 19.049 | 0 | [383, 429] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231213_230429__150.json | 50.0 | missing | missing | missing | |
| 761 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_214402__954 | 0 | 0.0 | 11.0613 | 0 | [392, 295] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_214402__954.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 762 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_214408__963 | 0 | 0.0 | 6.54506 | 0 | [392, 152] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_214408__963.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 763 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 4 | 20231226_204626__845 | 0 | 0.0 | 14.8077 | 0 | [392, 410] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_204626__845.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 764 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | AsIs | 1SHOT | false | false | 4 | 20231213_231443__118 | 0 | 0.0 | 11.7512 | 0 | [66, 349] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__AsIs__1SHOT__20231213_231443__118.json | 0.0 | missing | missing | missing | |
| 765 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | AsIs | 1SHOT | false | false | 4 | 20231224_220615__755 | 0 | 0.0 | 10.2355 | 0 | [83, 180] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__AsIs__1SHOT__20231224_220615__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 766 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | AsIs | 1SHOT | false | false | 4 | 20231224_220635__965 | 0 | 0.0 | 20.3389 | 0 | [83, 368] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__AsIs__1SHOT__20231224_220635__965.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 767 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | InJulia | 1SHOT | true | false | 4 | 20231213_231431__567 | 0 | 0.0 | 12.9547 | 0 | [82, 383] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__InJulia__1SHOT__20231213_231431__567.json | 25.0 | missing | missing | missing | |
| 768 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | InJulia | 1SHOT | true | true | 4 | 20231224_220602__831 | 0 | 0.0 | 7.32448 | 0 | [85, 125] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__InJulia__1SHOT__20231224_220602__831.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 769 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | InJulia | 1SHOT | false | false | 4 | 20231224_220605__366 | 0 | 0.0 | 2.8498 | 0 | [85, 39] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__InJulia__1SHOT__20231224_220605__366.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 770 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | InJulia | 1SHOT | true | true | 4 | 20231226_205626__600 | 0 | 0.0 | 9.0947 | 0 | [85, 159] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__InJulia__1SHOT__20231226_205626__600.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 771 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231213_231418__396 | 0 | 0.0 | 6.75085 | 0 | [112, 188] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231213_231418__396.json | 25.0 | missing | missing | missing | |
| 772 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_220534__596 | 0 | 0.0 | 17.8413 | 0 | [125, 316] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231224_220534__596.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 773 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_220554__427 | 0 | 0.0 | 20.4519 | 0 | [125, 363] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231224_220554__427.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 774 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_205616__701 | 0 | 0.0 | 3.43588 | 0 | [125, 46] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_205616__701.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 775 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_231412__836 | 0 | 0.0 | 14.2497 | 0 | [239, 361] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231412__836.json | 50.0 | missing | missing | missing | |
| 776 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_220511__304 | 0 | 0.0 | 20.7275 | 0 | [252, 167] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220511__304.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 777 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_220516__617 | 0 | 0.0 | 5.41993 | 0 | [252, 63] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220516__617.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 778 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_205613__710 | 0 | 0.0 | 26.0848 | 0 | [252, 258] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205613__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 779 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231213_231529__668 | 0 | 0.0 | 19.5313 | 0 | [11, 528] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231529__668.json | 0.0 | missing | missing | missing | |
| 780 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220741__118 | 0 | 0.0 | 17.4291 | 0 | [389, 255] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220741__118.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 781 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220846__645 | 0 | 0.0 | 64.9873 | 0 | [389, 1045] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220846__645.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 782 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_205714__445 | 0 | 0.0 | 15.0533 | 0 | [389, 214] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205714__445.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 783 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 4 | 20231213_231509__775 | 0 | 0.0 | 26.2222 | 0 | [383, 614] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231213_231509__775.json | 25.0 | missing | missing | missing | |
| 784 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220658__519 | 0 | 0.0 | 23.1072 | 0 | [386, 355] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231224_220658__519.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 785 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220724__632 | 0 | 0.0 | 25.2552 | 0 | [386, 392] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231224_220724__632.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 786 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_205659__326 | 0 | 0.0 | 33.3249 | 0 | [386, 533] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_205659__326.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 787 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 4 | 20231219_211756__703 | 0 | 0.0 | 9.52128 | 0 | [66, 285] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_211756__703.json | 0.0 | missing | missing | missing | |
| 788 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 4 | 20231219_211807__959 | 0 | 0.0 | 10.5285 | 0 | [1, 330] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_211807__959.json | 0.0 | missing | missing | missing | |
| 789 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 4 | 20231219_211816__211 | 0 | 0.0 | 9.18007 | 0 | [1, 290] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_211816__211.json | 0.0 | missing | missing | missing | |
| 790 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 4 | 20231224_223813__946 | 0 | 0.0 | 3.81443 | 0 | [76, 144] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231224_223813__946.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 791 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 4 | 20231224_223817__490 | 0 | 0.0 | 4.40774 | 0 | [76, 168] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231224_223817__490.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 792 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 4 | 20231219_211727__307 | 0 | 0.0 | 9.12611 | 0 | [1, 288] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_211727__307.json | 0.0 | missing | missing | missing | |
| 793 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | true | 4 | 20231219_211747__489 | 0 | 0.0 | 20.0043 | 0 | [1, 599] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_211747__489.json | 50.0 | missing | missing | missing | |
| 794 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | true | 4 | 20231224_223744__581 | 0 | 0.0 | 4.1468 | 0 | [79, 157] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231224_223744__581.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 795 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 4 | 20231224_223809__375 | 0 | 0.0 | 24.6085 | 0 | [79, 909] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231224_223809__375.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 796 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 4 | 20231226_211146__407 | 0 | 0.0 | 12.8638 | 0 | [79, 491] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_211146__407.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 797 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_211651__207 | 0 | 0.0 | 10.9584 | 0 | [1, 338] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_211651__207.json | 25.0 | missing | missing | missing | |
| 798 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231219_211657__785 | 0 | 0.0 | 5.71202 | 0 | [1, 181] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_211657__785.json | 50.0 | missing | missing | missing | |
| 799 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_223736__319 | 0 | 0.0 | 30.8922 | 0 | [116, 1108] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231224_223736__319.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 800 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_223740__779 | 0 | 0.0 | 4.63229 | 0 | [116, 172] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231224_223740__779.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 801 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_211133__320 | 0 | 0.0 | 20.2041 | 0 | [116, 747] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_211133__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 802 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231219_211612__279 | 0 | 0.0 | 11.5734 | 0 | [1, 342] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_211612__279.json | 50.0 | missing | missing | missing | |
| 803 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_211633__586 | 0 | 0.0 | 20.5712 | 0 | [1, 584] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_211633__586.json | 25.0 | missing | missing | missing | |
| 804 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_223702__671 | 0 | 0.0 | 5.11684 | 0 | [228, 28] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223702__671.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 805 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_223705__847 | 0 | 0.0 | 2.44581 | 0 | [228, 68] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223705__847.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 806 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_211113__501 | 0 | 0.0 | 7.95007 | 0 | [228, 147] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211113__501.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 807 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_211943__286 | 0 | 0.0 | 15.8221 | 0 | [1, 440] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_211943__286.json | 25.0 | missing | missing | missing | |
| 808 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231219_212000__721 | 0 | 0.0 | 16.7187 | 0 | [1, 463] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_212000__721.json | 50.0 | missing | missing | missing | |
| 809 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_223854__425 | 0 | 0.0 | 6.44599 | 0 | [368, 199] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223854__425.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 810 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_223919__845 | 0 | 0.0 | 24.8188 | 0 | [368, 837] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223919__845.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 811 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_211159__406 | 0 | 0.0 | 12.0725 | 0 | [368, 402] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211159__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 812 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_211855__981 | 0 | 0.0 | 20.4616 | 0 | [1, 559] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_211855__981.json | 25.0 | missing | missing | missing | |
| 813 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231219_211911__495 | 0 | 0.0 | 15.5511 | 0 | [1, 433] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_211911__495.json | 50.0 | missing | missing | missing | |
| 814 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_223824__986 | 0 | 0.0 | 6.9273 | 0 | [365, 212] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231224_223824__986.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 815 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_223847__876 | 0 | 0.0 | 23.2036 | 0 | [365, 783] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231224_223847__876.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 816 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_211147__233 | 0 | 0.0 | 1.23476 | 0 | [365, 1] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_211147__233.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 817 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 4 | 20231213_231624__647 | 0 | 0.0 | 14.7731 | 0 | [66, 437] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231213_231624__647.json | 0.0 | missing | missing | missing | |
| 818 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 4 | 20231224_221335__148 | 0 | 0.0 | 27.3002 | 0 | [91, 204] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231224_221335__148.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 819 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 4 | 20231224_221426__536 | 0 | 0.0 | 50.8091 | 0 | [91, 393] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231224_221426__536.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 820 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 4 | 20231213_231609__695 | 0 | 0.0 | 15.638 | 0 | [82, 460] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231213_231609__695.json | 50.0 | missing | missing | missing | |
| 821 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 4 | 20231224_221158__254 | 0 | 0.0 | 38.6556 | 0 | [93, 297] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231224_221158__254.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 822 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 4 | 20231224_221307__198 | 0 | 0.0 | 68.6903 | 0 | [93, 531] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231224_221307__198.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 823 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 4 | 20231226_205944__535 | 0 | 0.0 | 41.8397 | 0 | [93, 319] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_205944__535.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 824 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_231553__913 | 0 | 0.0 | 7.78618 | 0 | [112, 219] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231213_231553__913.json | 50.0 | missing | missing | missing | |
| 825 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_221033__344 | 1 | 0.0 | 12.0045 | 1 | [133, 72] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231224_221033__344.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 826 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_221119__394 | 1 | 0.0 | 45.5381 | 3 | [133, 341] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231224_221119__394.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 827 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_205901__525 | 1 | 0.0 | 39.9638 | 1 | [133, 297] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_205901__525.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 828 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231213_231545__744 | 0 | 0.0 | 16.2576 | 0 | [239, 416] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231545__744.json | 25.0 | missing | missing | missing | |
| 829 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_220950__612 | 1 | 0.0 | 63.9363 | 4 | [260, 282] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220950__612.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 830 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_221021__792 | 1 | 0.0 | 30.5228 | 4 | [260, 198] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_221021__792.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 831 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_205821__855 | 0 | 0.0 | 66.9719 | 0 | [260, 322] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205821__855.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 832 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231213_231708__422 | 0 | 0.0 | 21.9283 | 0 | [11, 587] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231708__422.json | 0.0 | missing | missing | missing | |
| 833 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231224_221730__174 | 0 | 0.0 | 47.3913 | 0 | [397, 305] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_221730__174.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 834 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_221757__868 | 1 | 0.0 | 26.3479 | 3 | [397, 141] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_221757__868.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 835 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_210127__779 | 1 | 0.0 | 46.201 | 3 | [397, 296] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_210127__779.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 836 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | false | false | 4 | 20231213_231646__454 | 0 | 0.0 | 22.4059 | 0 | [383, 518] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231213_231646__454.json | 0.0 | missing | missing | missing | |
| 837 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_221538__112 | 0 | 0.0 | 71.8218 | 0 | [394, 492] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231224_221538__112.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 838 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_221642__393 | 0 | 0.0 | 63.9471 | 0 | [394, 432] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231224_221642__393.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 839 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 4 | 20231226_210041__395 | 0 | 0.0 | 56.9533 | 0 | [394, 379] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_210041__395.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 840 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231219_210418__990 | 0 | 0.0 | 17.6966 | 0 | [66, 521] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_210418__990.json | 0.0 | missing | missing | missing | |
| 841 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231219_210433__427 | 0 | 0.0 | 14.3834 | 0 | [1, 442] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_210433__427.json | 0.0 | missing | missing | missing | |
| 842 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231219_210446__973 | 0 | 0.0 | 13.0265 | 0 | [1, 404] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_210446__973.json | 0.0 | missing | missing | missing | |
| 843 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231224_223032__536 | 0 | 0.0 | 17.1781 | 0 | [85, 289] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231224_223032__536.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 844 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 4 | 20231224_223048__343 | 0 | 0.0 | 15.7415 | 0 | [85, 264] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231224_223048__343.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 845 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231219_210350__149 | 0 | 0.0 | 11.3669 | 0 | [1, 355] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_210350__149.json | 50.0 | missing | missing | missing | |
| 846 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231219_210401__529 | 0 | 0.0 | 10.5193 | 0 | [1, 330] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_210401__529.json | 50.0 | missing | missing | missing | |
| 847 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231224_223004__851 | 0 | 0.0 | 16.6118 | 0 | [87, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231224_223004__851.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 848 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231224_223015__980 | 1 | 0.0 | 10.2473 | 1 | [87, 168] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231224_223015__980.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 849 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231226_210750__459 | 1 | 0.0 | 15.9249 | 1 | [87, 267] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_210750__459.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 850 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231219_210317__743 | 0 | 0.0 | 5.79113 | 0 | [1, 184] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_210317__743.json | 50.0 | missing | missing | missing | |
| 851 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231219_210326__706 | 0 | 0.0 | 8.97485 | 0 | [1, 280] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_210326__706.json | 50.0 | missing | missing | missing | |
| 852 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_222931__462 | 0 | 0.0 | 12.466 | 0 | [129, 198] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_222931__462.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 853 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_222948__793 | 0 | 0.0 | 17.0255 | 0 | [129, 277] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_222948__793.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 854 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_210734__789 | 0 | 0.0 | 12.1807 | 0 | [129, 193] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_210734__789.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 855 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231219_210253__101 | 0 | 0.0 | 16.9536 | 0 | [1, 489] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_210253__101.json | 50.0 | missing | missing | missing | |
| 856 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231219_210305__725 | 0 | 0.0 | 11.3949 | 0 | [1, 337] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_210305__725.json | 0.0 | missing | missing | missing | |
| 857 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_222859__310 | 0 | 0.0 | 34.2145 | 0 | [256, 388] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_222859__310.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 858 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_222918__565 | 0 | 0.0 | 18.6582 | 0 | [256, 288] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_222918__565.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 859 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_210721__583 | 0 | 0.0 | 36.5823 | 0 | [256, 444] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_210721__583.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 860 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231219_210626__363 | 0 | 0.0 | 22.3206 | 0 | [1, 605] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_210626__363.json | 50.0 | missing | missing | missing | |
| 861 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231219_210645__520 | 0 | 0.0 | 19.6091 | 0 | [1, 537] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_210645__520.json | 50.0 | missing | missing | missing | |
| 862 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_223201__488 | 0 | 0.0 | 19.0267 | 0 | [396, 269] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223201__488.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 863 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_223220__772 | 0 | 0.0 | 18.4085 | 0 | [396, 259] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223220__772.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 864 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_210835__621 | 0 | 0.0 | 18.8388 | 0 | [396, 266] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_210835__621.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 865 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231219_210527__289 | 0 | 0.0 | 23.5755 | 0 | [1, 636] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_210527__289.json | 0.0 | missing | missing | missing | |
| 866 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231219_210539__226 | 0 | 0.0 | 11.8552 | 0 | [1, 335] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_210539__226.json | 0.0 | missing | missing | missing | |
| 867 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_223115__169 | 0 | 0.0 | 26.9897 | 0 | [394, 401] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223115__169.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 868 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_223142__961 | 0 | 0.0 | 27.266 | 0 | [394, 406] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223142__961.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 869 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_210816__340 | 0 | 0.0 | 26.5701 | 0 | [394, 394] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_210816__340.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 870 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | AsIs | 1SHOT | false | false | 4 | 20231213_231313__769 | 0 | 0.0 | 19.3212 | 0 | [66, 564] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__AsIs__1SHOT__20231213_231313__769.json | 0.0 | missing | missing | missing | |
| 871 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | AsIs | 1SHOT | false | false | 4 | 20231224_220418__133 | 0 | 0.0 | 5.56169 | 0 | [85, 314] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__AsIs__1SHOT__20231224_220418__133.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 872 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | AsIs | 1SHOT | false | false | 4 | 20231224_220422__822 | 0 | 0.0 | 4.39978 | 0 | [85, 248] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__AsIs__1SHOT__20231224_220422__822.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 873 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | InJulia | 1SHOT | false | false | 4 | 20231213_231254__188 | 0 | 0.0 | 10.1145 | 0 | [82, 299] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__InJulia__1SHOT__20231213_231254__188.json | 0.0 | missing | missing | missing | |
| 874 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | InJulia | 1SHOT | true | true | 4 | 20231224_220404__640 | 0 | 0.0 | 5.89868 | 0 | [88, 333] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__InJulia__1SHOT__20231224_220404__640.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 875 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | InJulia | 1SHOT | false | false | 4 | 20231224_220412__706 | 0 | 0.0 | 7.52091 | 0 | [88, 422] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__InJulia__1SHOT__20231224_220412__706.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 876 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | InJulia | 1SHOT | true | true | 4 | 20231226_205530__757 | 0 | 0.0 | 5.22782 | 0 | [88, 295] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_205530__757.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 877 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_231244__318 | 0 | 0.0 | 7.07503 | 0 | [112, 197] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231213_231244__318.json | 50.0 | missing | missing | missing | |
| 878 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_220353__403 | 0 | 0.0 | 4.95009 | 0 | [125, 273] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231224_220353__403.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 879 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_220359__789 | 0 | 0.0 | 5.2296 | 0 | [125, 288] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231224_220359__789.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 880 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_205524__721 | 0 | 0.0 | 1.82697 | 0 | [125, 92] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_205524__721.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 881 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_231237__298 | 0 | 0.0 | 20.17 | 0 | [239, 522] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231237__298.json | 50.0 | missing | missing | missing | |
| 882 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_220339__304 | 0 | 0.0 | 15.5642 | 0 | [232, 646] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220339__304.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 883 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_220348__593 | 0 | 0.0 | 8.72532 | 0 | [232, 445] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220348__593.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 884 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_205523__760 | 0 | 0.0 | 9.48193 | 0 | [232, 348] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205523__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 885 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231213_231357__110 | 0 | 0.0 | 19.6635 | 0 | [11, 531] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231357__110.json | 0.0 | missing | missing | missing | |
| 886 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220440__339 | 0 | 0.0 | 5.62358 | 0 | [375, 247] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220440__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 887 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220450__615 | 0 | 0.0 | 9.6912 | 0 | [375, 454] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220450__615.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 888 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_205547__151 | 0 | 0.0 | 10.674 | 0 | [375, 502] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205547__151.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 889 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_231337__268 | 0 | 0.0 | 24.209 | 0 | [383, 563] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231213_231337__268.json | 50.0 | missing | missing | missing | |
| 890 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220429__737 | 0 | 0.0 | 6.85357 | 0 | [373, 311] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231224_220429__737.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 891 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220435__549 | 0 | 0.0 | 5.6292 | 0 | [373, 248] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231224_220435__549.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 892 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_205536__902 | 0 | 0.0 | 6.42983 | 0 | [373, 289] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_205536__902.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 893 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | AsIs | 1SHOT | false | false | 4 | 20231213_230530__995 | 0 | 0.0 | 13.2135 | 0 | [66, 393] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__AsIs__1SHOT__20231213_230530__995.json | 0.0 | missing | missing | missing | |
| 894 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | AsIs | 1SHOT | false | false | 4 | 20231224_214610__303 | 0 | 0.0 | 5.26193 | 0 | [84, 162] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__AsIs__1SHOT__20231224_214610__303.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 895 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | AsIs | 1SHOT | false | false | 4 | 20231224_214620__667 | 0 | 0.0 | 9.94992 | 0 | [84, 316] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__AsIs__1SHOT__20231224_214620__667.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 896 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | InJulia | 1SHOT | true | true | 4 | 20231213_230516__550 | 0 | 0.0 | 10.7484 | 0 | [82, 319] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__InJulia__1SHOT__20231213_230516__550.json | 50.0 | missing | missing | missing | |
| 897 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | InJulia | 1SHOT | true | true | 4 | 20231224_214559__800 | 0 | 0.0 | 7.97435 | 0 | [87, 251] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_214559__800.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 898 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | InJulia | 1SHOT | true | true | 4 | 20231224_214604__293 | 0 | 0.0 | 5.46209 | 0 | [87, 169] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_214604__293.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 899 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | InJulia | 1SHOT | true | true | 4 | 20231226_204732__983 | 0 | 0.0 | 13.1953 | 0 | [87, 421] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_204732__983.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 900 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231213_230506__910 | 0 | 0.0 | 6.41789 | 0 | [112, 178] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231213_230506__910.json | 25.0 | missing | missing | missing | |
| 901 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214544__136 | 0 | 0.0 | 15.4212 | 0 | [129, 483] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_214544__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 902 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214551__136 | 0 | 0.0 | 6.49753 | 0 | [129, 193] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_214551__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 903 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_204718__917 | 0 | 0.0 | 8.15732 | 0 | [129, 248] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_204718__917.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 904 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231213_230459__319 | 0 | 0.0 | 15.1655 | 0 | [239, 385] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230459__319.json | 0.0 | missing | missing | missing | |
| 905 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_214512__971 | 0 | 0.0 | 31.1972 | 1 | [256, 768] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214512__971.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 906 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_214528__433 | 0 | 0.0 | 15.5994 | 0 | [256, 466] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214528__433.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 907 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_204710__179 | 0 | 0.0 | 21.5997 | 0 | [256, 486] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_204710__179.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 908 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231213_230604__469 | 0 | 0.0 | 11.7843 | 0 | [11, 327] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230604__469.json | 0.0 | missing | missing | missing | |
| 909 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214704__595 | 0 | 0.0 | 15.7235 | 0 | [396, 439] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214704__595.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 910 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214712__559 | 0 | 0.0 | 7.48527 | 1 | [396, 182] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214712__559.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 911 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_204803__262 | 1 | 0.0 | 19.9672 | 1 | [396, 566] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_204803__262.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 912 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_230552__351 | 0 | 0.0 | 22.6418 | 0 | [383, 526] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231213_230552__351.json | 50.0 | missing | missing | missing | |
| 913 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_214634__687 | 0 | 0.0 | 13.5355 | 0 | [394, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_214634__687.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 914 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_214648__650 | 0 | 0.0 | 14.2957 | 0 | [394, 395] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_214648__650.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 915 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_204743__236 | 0 | 0.0 | 10.3734 | 0 | [394, 272] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_204743__236.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 916 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | AsIs | 1SHOT | false | false | 4 | 20231213_230648__276 | 0 | 0.0 | 12.8264 | 0 | [66, 382] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__AsIs__1SHOT__20231213_230648__276.json | 0.0 | missing | missing | missing | |
| 917 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | AsIs | 1SHOT | false | false | 4 | 20231224_215226__892 | 0 | 0.0 | 67.469 | 0 | [80, 504] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__AsIs__1SHOT__20231224_215226__892.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 918 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | AsIs | 1SHOT | false | false | 4 | 20231224_215303__817 | 0 | 0.0 | 36.9497 | 0 | [80, 273] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__AsIs__1SHOT__20231224_215303__817.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 919 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | InJulia | 1SHOT | false | false | 4 | 20231213_230635__773 | 0 | 0.0 | 17.8071 | 0 | [82, 522] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__InJulia__1SHOT__20231213_230635__773.json | 0.0 | missing | missing | missing | |
| 920 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | InJulia | 1SHOT | true | false | 4 | 20231224_214941__941 | 0 | 0.0 | 36.2942 | 0 | [83, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_214941__941.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 921 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | InJulia | 1SHOT | true | true | 4 | 20231224_215118__499 | 0 | 0.0 | 96.6047 | 0 | [83, 719] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_215118__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 922 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | InJulia | 1SHOT | true | false | 4 | 20231226_204956__298 | 0 | 0.0 | 41.0559 | 0 | [83, 302] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_204956__298.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 923 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_230617__423 | 0 | 0.0 | 5.68054 | 0 | [112, 154] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231213_230617__423.json | 50.0 | missing | missing | missing | |
| 924 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214840__705 | 0 | 0.0 | 9.99507 | 0 | [122, 58] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_214840__705.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 925 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214904__521 | 1 | 0.0 | 24.0301 | 1 | [122, 167] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_214904__521.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 926 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_204914__195 | 1 | 0.0 | 9.22783 | 1 | [122, 52] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_204914__195.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 927 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231213_230612__174 | 0 | 0.0 | 7.55765 | 0 | [239, 171] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230612__174.json | 25.0 | missing | missing | missing | |
| 928 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231224_214806__891 | 0 | 0.0 | 53.8569 | 0 | [248, 159] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214806__891.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 929 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_214830__407 | 0 | 0.0 | 24.4047 | 0 | [248, 148] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214830__407.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 930 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_204905__349 | 0 | 0.0 | 62.1504 | 0 | [248, 242] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_204905__349.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 931 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_230726__170 | 0 | 0.0 | 19.3507 | 0 | [11, 519] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230726__170.json | 50.0 | missing | missing | missing | |
| 932 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_215516__990 | 0 | 0.0 | 37.8541 | 0 | [396, 221] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231224_215516__990.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 933 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231224_215614__189 | 0 | 0.0 | 57.4387 | 0 | [396, 365] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231224_215614__189.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 934 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231226_205220__656 | 0 | 0.0 | 52.9814 | 0 | [396, 333] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205220__656.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 935 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 4 | 20231213_230707__428 | 0 | 0.0 | 18.6963 | 0 | [383, 424] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231213_230707__428.json | 0.0 | missing | missing | missing | |
| 936 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 4 | 20231224_215356__677 | 0 | 0.0 | 53.3473 | 0 | [394, 336] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_215356__677.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 937 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_215438__237 | 1 | 0.0 | 41.6787 | 1 | [394, 250] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_215438__237.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 938 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 4 | 20231226_205127__551 | 0 | 0.0 | 90.4767 | 0 | [394, 600] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_205127__551.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 939 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200148__301 | 5 | 0.000279 | 1.16515 | 2 | [102, 152] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200148__301.json | 100.0 | missing | missing | missing | |
| 940 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | false | 5 | 20240201_200150__674 | 0 | 0.000447 | 1.84113 | 0 | [102, 264] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200150__674.json | 25.0 | missing | missing | missing | |
| 941 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200152__690 | 0 | 0.0004065 | 1.88078 | 0 | [102, 237] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200152__690.json | 50.0 | missing | missing | missing | |
| 942 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200153__588 | 5 | 0.000252 | 1.1149 | 2 | [102, 134] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200153__588.json | 100.0 | missing | missing | missing | |
| 943 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200154__411 | 0 | 0.000243 | 1.22091 | 0 | [102, 128] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200154__411.json | 50.0 | missing | missing | missing | |
| 944 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200142__650 | 0 | 0.0001885 | 0.78373 | 0 | [137, 80] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200142__650.json | 50.0 | missing | missing | missing | |
| 945 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200143__413 | 0 | 0.0001855 | 0.965684 | 0 | [137, 78] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200143__413.json | 50.0 | missing | missing | missing | |
| 946 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200144__910 | 0 | 0.0001975 | 0.764664 | 0 | [137, 86] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200144__910.json | 50.0 | missing | missing | missing | |
| 947 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200145__705 | 0 | 0.0001765 | 0.743374 | 0 | [137, 72] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200145__705.json | 50.0 | missing | missing | missing | |
| 948 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200146__109 | 0 | 0.0002185 | 1.07541 | 0 | [137, 100] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200146__109.json | 50.0 | missing | missing | missing | |
| 949 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200135__369 | 0 | 0.000386 | 1.46536 | 0 | [283, 163] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200135__369.json | 25.0 | missing | missing | missing | |
| 950 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200137__479 | 0 | 0.000389 | 1.39283 | 0 | [283, 165] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200137__479.json | 25.0 | missing | missing | missing | |
| 951 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200138__452 | 0 | 0.000389 | 1.41804 | 0 | [283, 165] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200138__452.json | 25.0 | missing | missing | missing | |
| 952 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200140__382 | 5 | 0.0003845 | 1.50287 | 2 | [283, 162] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200140__382.json | 100.0 | missing | missing | missing | |
| 953 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200141__690 | 0 | 0.0004025 | 1.42132 | 0 | [283, 174] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200141__690.json | 25.0 | missing | missing | missing | |
| 954 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200200__868 | 0 | 0.0003705 | 1.00478 | 0 | [360, 127] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200200__868.json | 50.0 | missing | missing | missing | |
| 955 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200202__859 | 5 | 0.0003735 | 1.0274 | 2 | [360, 129] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200202__859.json | 100.0 | missing | missing | missing | |
| 956 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200203__380 | 0 | 0.000348 | 1.13287 | 0 | [360, 112] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200203__380.json | 50.0 | missing | missing | missing | |
| 957 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200204__822 | 5 | 0.0002985 | 0.838736 | 2 | [360, 79] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200204__822.json | 100.0 | missing | missing | missing | |
| 958 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200205__773 | 0 | 0.000486 | 1.48754 | 1 | [360, 204] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200205__773.json | 62.5 | missing | missing | missing | |
| 959 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200155__378 | 0 | 0.000313 | 1.03343 | 0 | [359, 89] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200155__378.json | 50.0 | missing | missing | missing | |
| 960 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200156__262 | 0 | 0.0002875 | 0.927145 | 0 | [359, 72] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200156__262.json | 50.0 | missing | missing | missing | |
| 961 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200157__492 | 0 | 0.0003355 | 0.950071 | 1 | [359, 104] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200157__492.json | 62.5 | missing | missing | missing | |
| 962 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200158__763 | 5 | 0.000289 | 0.729328 | 2 | [359, 73] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200158__763.json | 100.0 | missing | missing | missing | |
| 963 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200159__890 | 5 | 0.0003535 | 1.00948 | 2 | [359, 116] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200159__890.json | 100.0 | missing | missing | missing | |
| 964 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_070652__527 | 0 | 0.0126 | 26.2073 | 0 | [102, 386] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_070652__527.json | 50.0 | missing | missing | missing | |
| 965 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_070712__276 | 0 | 0.01524 | 20.0335 | 0 | [102, 474] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_070712__276.json | 50.0 | missing | missing | missing | |
| 966 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_070748__865 | 0 | 0.01686 | 35.2471 | 0 | [102, 528] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_070748__865.json | 50.0 | missing | missing | missing | |
| 967 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_070825__463 | 0 | 0.01635 | 37.3726 | 0 | [102, 511] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_070825__463.json | 50.0 | missing | missing | missing | |
| 968 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | InJulia | 1SHOT | true | false | 5 | 20240201_070855__739 | 0 | 0.0135 | 29.9803 | 0 | [102, 416] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_070855__739.json | 25.0 | missing | missing | missing | |
| 969 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_070240__770 | 0 | 0.00491 | 10.6416 | 0 | [137, 118] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_070240__770.json | 50.0 | missing | missing | missing | |
| 970 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_070252__227 | 0 | 0.00404 | 11.0423 | 0 | [137, 89] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_070252__227.json | 50.0 | missing | missing | missing | |
| 971 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_070301__923 | 0 | 0.00464 | 8.75509 | 0 | [137, 109] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_070301__923.json | 50.0 | missing | missing | missing | |
| 972 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_070310__533 | 0 | 0.00425 | 9.28741 | 0 | [137, 96] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_070310__533.json | 50.0 | missing | missing | missing | |
| 973 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_070320__146 | 0 | 0.0047 | 10.007 | 0 | [137, 111] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_070320__146.json | 50.0 | missing | missing | missing | |
| 974 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_065849__125 | 0 | 0.01654 | 61.4966 | 0 | [283, 457] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_065849__125.json | 25.0 | missing | missing | missing | |
| 975 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_065926__740 | 0 | 0.01642 | 37.068 | 0 | [283, 453] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_065926__740.json | 50.0 | missing | missing | missing | |
| 976 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_070005__355 | 0 | 0.01651 | 38.6934 | 0 | [283, 456] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_070005__355.json | 50.0 | missing | missing | missing | |
| 977 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_070047__946 | 0 | 0.0178 | 41.6997 | 0 | [283, 499] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_070047__946.json | 25.0 | missing | missing | missing | |
| 978 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_070126__949 | 0 | 0.01408 | 38.6667 | 0 | [283, 375] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_070126__949.json | 25.0 | missing | missing | missing | |
| 979 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_071839__170 | 0 | 0.0183 | 32.4417 | 0 | [360, 490] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_071839__170.json | 50.0 | missing | missing | missing | |
| 980 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_071916__489 | 0 | 0.01776 | 36.2544 | 0 | [360, 472] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_071916__489.json | 50.0 | missing | missing | missing | |
| 981 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_071957__690 | 0 | 0.01584 | 40.9366 | 0 | [360, 408] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_071957__690.json | 50.0 | missing | missing | missing | |
| 982 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_072048__382 | 0 | 0.01866 | 51.0412 | 0 | [360, 502] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_072048__382.json | 50.0 | missing | missing | missing | |
| 983 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_072122__434 | 0 | 0.01746 | 33.285 | 0 | [360, 462] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_072122__434.json | 50.0 | missing | missing | missing | |
| 984 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_071247__613 | 0 | 0.01841 | 33.5649 | 0 | [359, 494] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_071247__613.json | 50.0 | missing | missing | missing | |
| 985 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_071318__335 | 0 | 0.01595 | 29.6965 | 0 | [359, 412] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_071318__335.json | 50.0 | missing | missing | missing | |
| 986 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_071405__499 | 0 | 0.02063 | 46.8069 | 0 | [359, 568] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_071405__499.json | 50.0 | missing | missing | missing | |
| 987 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_071451__945 | 0 | 0.01598 | 46.3019 | 0 | [359, 413] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_071451__945.json | 50.0 | missing | missing | missing | |
| 988 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_071527__310 | 0 | 0.01703 | 35.457 | 0 | [359, 448] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/audi_filter/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_071527__310.json | 50.0 | missing | missing | missing | |
| 989 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231213_232532__251 | 0 | 0.0 | 17.8388 | 0 | [95, 515] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231213_232532__251.json | 0.0 | missing | missing | missing | |
| 990 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231224_225950__246 | 0 | 0.0 | 6.24579 | 0 | [117, 100] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231224_225950__246.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 991 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231224_230003__797 | 0 | 0.0 | 13.4543 | 0 | [117, 236] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231224_230003__797.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 992 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231213_232514__958 | 0 | 0.0 | 13.319 | 0 | [112, 384] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231213_232514__958.json | 25.0 | missing | missing | missing | |
| 993 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231224_225931__253 | 0 | 0.0 | 13.5563 | 0 | [120, 238] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231224_225931__253.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 994 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231224_225944__362 | 0 | 0.0 | 12.4507 | 0 | [120, 217] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231224_225944__362.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 995 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231226_212420__786 | 0 | 0.0 | 15.4032 | 0 | [120, 261] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_212420__786.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 996 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_232501__569 | 0 | 0.0 | 11.7851 | 0 | [141, 328] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231213_232501__569.json | 0.0 | missing | missing | missing | |
| 997 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_225911__412 | 5 | 0.0 | 13.3111 | 2 | [158, 227] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231224_225911__412.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 998 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_225917__408 | 0 | 0.0 | 5.56346 | 0 | [158, 81] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231224_225917__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 999 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_212404__496 | 0 | 0.0 | 13.511 | 0 | [158, 217] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_212404__496.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1000 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_232449__119 | 0 | 0.0 | 15.3734 | 0 | [311, 364] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232449__119.json | 0.0 | missing | missing | missing | |
| 1001 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_225851__500 | 5 | 0.0 | 24.2962 | 2 | [329, 212] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231224_225851__500.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1002 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_225858__277 | 0 | 0.0 | 6.74498 | 0 | [329, 73] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231224_225858__277.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1003 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_212350__450 | 0 | 0.0 | 21.2614 | 0 | [329, 161] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212350__450.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1004 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_232613__120 | 0 | 0.0 | 20.392 | 0 | [11, 544] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232613__120.json | 50.0 | missing | missing | missing | |
| 1005 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_230133__762 | 0 | 0.0 | 36.4796 | 0 | [423, 579] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230133__762.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1006 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_230157__871 | 0 | 0.0 | 22.7769 | 0 | [423, 345] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230157__871.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1007 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_212512__303 | 0 | 0.0 | 16.2021 | 0 | [423, 220] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212512__303.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1008 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_232553__613 | 0 | 0.0 | 20.2837 | 0 | [412, 450] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231213_232553__613.json | 50.0 | missing | missing | missing | |
| 1009 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_230032__825 | 0 | 0.0 | 28.5827 | 0 | [420, 446] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231224_230032__825.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1010 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_230057__213 | 0 | 0.0 | 24.8535 | 0 | [420, 382] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231224_230057__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1011 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_212456__774 | 0 | 0.0 | 35.5977 | 0 | [420, 545] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_212456__774.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1012 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203615__563 | 0 | 0.0 | 2.72377 | 0 | [0, 209] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203615__563.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1013 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203619__261 | 5 | 0.0 | 2.90643 | 2 | [0, 223] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203619__261.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1014 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203623__539 | 0 | 0.0 | 4.59761 | 0 | [0, 351] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203623__539.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1015 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203627__431 | 0 | 0.0 | 3.69539 | 0 | [0, 283] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203627__431.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1016 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203631__622 | 0 | 0.0 | 3.33745 | 0 | [0, 256] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203631__622.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1017 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_203550__273 | 0 | 0.0 | 1.01423 | 0 | [0, 78] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203550__273.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1018 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_203551__508 | 0 | 0.0 | 1.20895 | 0 | [0, 93] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203551__508.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1019 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_203554__158 | 0 | 0.0 | 2.66432 | 0 | [0, 204] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203554__158.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1020 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240131_203556__823 | 0 | 0.0 | 2.09949 | 0 | [0, 160] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203556__823.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1021 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_203557__367 | 0 | 0.0 | 1.34061 | 0 | [0, 103] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203557__367.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1022 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_203533__410 | 0 | 0.0 | 2.98747 | 0 | [0, 225] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203533__410.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1023 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_203535__414 | 0 | 0.0 | 1.86319 | 0 | [0, 141] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203535__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1024 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_203537__700 | 0 | 0.0 | 1.68965 | 0 | [0, 128] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203537__700.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1025 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_203538__351 | 0 | 0.0 | 1.13284 | 0 | [0, 86] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203538__351.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1026 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240131_203540__859 | 0 | 0.0 | 1.83681 | 0 | [0, 139] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203540__859.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1027 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_203736__676 | 0 | 0.0 | 4.33274 | 0 | [0, 322] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203736__676.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1028 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_203739__804 | 0 | 0.0 | 3.28948 | 0 | [0, 245] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203739__804.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1029 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_203743__793 | 0 | 0.0 | 3.56194 | 0 | [0, 265] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203743__793.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1030 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_203746__773 | 5 | 0.0 | 3.02975 | 2 | [0, 226] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203746__773.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1031 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_203751__504 | 4 | 0.0 | 4.58923 | 2 | [0, 341] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203751__504.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1032 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203701__593 | 0 | 0.0 | 3.10065 | 0 | [0, 231] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203701__593.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1033 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_203703__858 | 0 | 0.0 | 1.77975 | 0 | [0, 133] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203703__858.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1034 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203707__937 | 0 | 0.0 | 4.54249 | 0 | [0, 337] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203707__937.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1035 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20240131_203712__289 | 0 | 0.0 | 4.19428 | 0 | [0, 312] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203712__289.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1036 | NVIDIA-RTX-4090-4x | audi_filter | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203715__603 | 0 | 0.0 | 3.82693 | 0 | [0, 285] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203715__603.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1037 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231213_232706__248 | 0 | 0.0 | 10.3853 | 0 | [95, 303] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__AsIs__1SHOT__20231213_232706__248.json | 0.0 | missing | missing | missing | |
| 1038 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231224_230424__807 | 0 | 0.0 | 60.7245 | 0 | [91, 1069] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__AsIs__1SHOT__20231224_230424__807.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1039 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231224_230510__222 | 0 | 0.0 | 45.685 | 0 | [91, 818] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__AsIs__1SHOT__20231224_230510__222.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1040 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231213_232656__691 | 0 | 0.0 | 12.9591 | 0 | [112, 373] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__InJulia__1SHOT__20231213_232656__691.json | 0.0 | missing | missing | missing | |
| 1041 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231224_230310__987 | 0 | 0.0 | 19.9744 | 0 | [94, 362] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__InJulia__1SHOT__20231224_230310__987.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1042 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231224_230323__346 | 0 | 0.0 | 13.7482 | 0 | [94, 246] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__InJulia__1SHOT__20231224_230323__346.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1043 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_232643__163 | 0 | 0.0 | 5.08613 | 0 | [141, 128] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231213_232643__163.json | 0.0 | missing | missing | missing | |
| 1044 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230243__752 | 0 | 0.0 | 29.3851 | 0 | [95, 533] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231224_230243__752.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1045 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230250__297 | 0 | 0.0 | 6.651 | 0 | [95, 112] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231224_230250__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1046 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_232638__835 | 0 | 0.0 | 24.3532 | 0 | [311, 598] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232638__835.json | 0.0 | missing | missing | missing | |
| 1047 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230212__242 | 0 | 0.0 | 15.0038 | 0 | [204, 58] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230212__242.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1048 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_232753__165 | 0 | 0.0 | 20.4715 | 0 | [11, 546] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232753__165.json | 25.0 | missing | missing | missing | |
| 1049 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_230551__638 | 0 | 0.0 | 8.0979 | 0 | [112, 135] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230551__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1050 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_230553__168 | 0 | 0.0 | 2.29263 | 0 | [112, 24] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230553__168.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1051 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_232732__718 | 0 | 0.0 | 26.2838 | 0 | [412, 600] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231213_232732__718.json | 50.0 | missing | missing | missing | |
| 1052 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_230535__473 | 0 | 0.0 | 24.9075 | 0 | [109, 447] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231224_230535__473.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1053 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_230543__850 | 0 | 0.0 | 8.02735 | 0 | [109, 134] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231224_230543__850.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1054 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_204358__501 | 0 | 0.0 | 7.30222 | 0 | [0, 266] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_204358__501.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1055 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_204402__495 | 0 | 0.0 | 4.19123 | 0 | [0, 153] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_204402__495.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1056 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_204408__150 | 0 | 0.0 | 5.12587 | 0 | [0, 187] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_204408__150.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1057 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_204417__956 | 0 | 0.0 | 9.12629 | 0 | [0, 332] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_204417__956.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1058 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_204426__205 | 0 | 0.0 | 8.97398 | 0 | [0, 326] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_204426__205.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1059 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_204252__428 | 0 | 0.0 | 8.8338 | 0 | [0, 321] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_204252__428.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1060 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_204258__735 | 0 | 0.0 | 5.63189 | 0 | [0, 205] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_204258__735.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1061 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_204304__984 | 0 | 0.0 | 6.12712 | 0 | [0, 223] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_204304__984.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1062 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_204307__558 | 0 | 0.0 | 2.13736 | 0 | [0, 78] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_204307__558.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1063 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_204313__624 | 0 | 0.0 | 5.80226 | 0 | [0, 211] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_204313__624.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1064 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_204128__971 | 0 | 0.0 | 7.70842 | 0 | [0, 278] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_204128__971.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1065 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_204138__388 | 0 | 0.0 | 9.72645 | 0 | [0, 350] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_204138__388.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1066 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_204142__955 | 0 | 0.0 | 3.81062 | 0 | [0, 138] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_204142__955.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1067 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240131_204159__995 | 0 | 0.0 | 17.2223 | 0 | [0, 617] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_204159__995.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1068 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_204205__742 | 0 | 0.0 | 5.11024 | 0 | [0, 185] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_204205__742.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1069 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240131_204619__661 | 0 | 0.0 | 4.07781 | 0 | [0, 147] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_204619__661.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1070 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_204626__979 | 0 | 0.0 | 7.31584 | 0 | [0, 263] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_204626__979.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1071 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_204643__603 | 0 | 0.0 | 17.5415 | 0 | [0, 627] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_204643__603.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1072 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240131_204705__564 | 0 | 0.0 | 21.1873 | 0 | [0, 757] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_204705__564.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1073 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240131_204722__105 | 0 | 0.0 | 16.6736 | 0 | [0, 596] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_204722__105.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1074 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_204456__386 | 0 | 0.0 | 0.141219 | 0 | [0, 5] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_204456__386.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1075 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_204517__446 | 0 | 0.0 | 21.5479 | 0 | [0, 769] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_204517__446.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1076 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_204527__863 | 0 | 0.0 | 9.71798 | 0 | [0, 348] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_204527__863.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1077 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20240131_204533__749 | 0 | 0.0 | 6.12706 | 0 | [0, 220] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_204533__749.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1078 | NVIDIA-RTX-4090-4x | audi_filter | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_204543__350 | 0 | 0.0 | 9.3196 | 0 | [0, 334] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_204543__350.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1079 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240131_202640__223 | 0 | 0.0 | 12.4987 | 0 | [0, 310] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_202640__223.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1080 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240131_202653__752 | 0 | 0.0 | 12.6794 | 0 | [0, 314] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_202653__752.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1081 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240131_202701__961 | 0 | 0.0 | 8.17463 | 1 | [0, 203] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_202701__961.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1082 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240131_203206__233 | 0 | 0.0 | 4.27208 | 0 | [126, 98] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_203206__233.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1083 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_070321__194 | 4 | 0.0 | 55.3396 | 2 | [127, 705] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_070321__194.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1084 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_202524__719 | 0 | 0.0 | 5.63319 | 0 | [0, 140] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_202524__719.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1085 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_202525__336 | 0 | 0.0 | 0.804283 | 0 | [0, 20] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_202525__336.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1086 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_202527__950 | 0 | 0.0 | 2.44952 | 0 | [0, 61] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_202527__950.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1087 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_202535__928 | 0 | 0.0 | 8.42615 | 0 | [0, 209] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_202535__928.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1088 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_202544__341 | 0 | 0.0 | 8.71644 | 0 | [0, 216] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_202544__341.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1089 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_202405__119 | 0 | 0.0 | 7.46197 | 0 | [0, 184] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_202405__119.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1090 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_202412__297 | 0 | 0.0 | 7.41938 | 0 | [0, 183] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_202412__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1091 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_202419__705 | 0 | 0.0 | 7.16446 | 0 | [0, 177] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_202419__705.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1092 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_202428__976 | 0 | 0.0 | 8.97356 | 0 | [0, 221] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_202428__976.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1093 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240131_202438__361 | 0 | 0.0 | 9.75678 | 0 | [0, 240] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_202438__361.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1094 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_203449__879 | 0 | 0.0 | 8.04686 | 0 | [0, 197] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203449__879.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1095 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_203458__166 | 0 | 0.0 | 8.86482 | 0 | [0, 217] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203458__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1096 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_203505__711 | 0 | 0.0 | 7.01938 | 0 | [0, 172] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203505__711.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1097 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_203510__188 | 0 | 0.0 | 4.84134 | 0 | [0, 119] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203510__188.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1098 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_203513__543 | 0 | 0.0 | 2.96191 | 0 | [0, 73] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203513__543.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1099 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203328__426 | 0 | 0.0 | 20.3278 | 0 | [0, 495] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_203328__426.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1100 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203340__891 | 0 | 0.0 | 10.8514 | 0 | [0, 265] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_203340__891.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1101 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203349__379 | 1 | 0.0 | 9.32152 | 2 | [0, 228] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_203349__379.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1102 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_203349__948 | 0 | 0.0 | 0.127308 | 0 | [0, 3] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_203349__948.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1103 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_203350__863 | 0 | 0.0 | 0.124946 | 0 | [0, 3] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_203350__863.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1104 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_201732__549 | 0 | 0.0 | 11.6275 | 0 | [0, 219] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_201732__549.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1105 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_201745__811 | 0 | 0.0 | 13.0771 | 0 | [0, 246] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_201745__811.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1106 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_201756__854 | 0 | 0.0 | 11.6809 | 0 | [0, 220] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_201756__854.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1107 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_201808__908 | 0 | 0.0 | 11.2056 | 0 | [0, 211] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_201808__908.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1108 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_201822__385 | 0 | 0.0 | 14.1856 | 0 | [0, 267] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_201822__385.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1109 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_201503__297 | 0 | 0.0 | 24.4839 | 0 | [0, 458] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_201503__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1110 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_201518__882 | 0 | 0.0 | 15.8315 | 0 | [0, 297] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_201518__882.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1111 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_201538__596 | 0 | 0.0 | 19.3678 | 0 | [0, 363] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_201538__596.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1112 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_201552__818 | 0 | 0.0 | 14.4019 | 0 | [0, 270] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_201552__818.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1113 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_201612__880 | 0 | 0.0 | 19.5353 | 1 | [0, 366] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_201612__880.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1114 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_201249__284 | 0 | 0.0 | 23.5151 | 0 | [0, 437] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_201249__284.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1115 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_201306__798 | 0 | 0.0 | 16.4838 | 0 | [0, 307] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_201306__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1116 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_201313__541 | 0 | 0.0 | 7.22152 | 0 | [0, 135] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_201313__541.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1117 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_201329__772 | 0 | 0.0 | 15.8817 | 0 | [0, 296] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_201329__772.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1118 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_201341__193 | 0 | 0.0 | 11.8391 | 0 | [0, 221] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_201341__193.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1119 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_202156__406 | 0 | 0.0 | 19.2128 | 0 | [0, 356] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_202156__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1120 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_202206__478 | 0 | 0.0 | 9.47274 | 0 | [0, 176] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_202206__478.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1121 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_202215__949 | 0 | 0.0 | 9.03366 | 0 | [0, 168] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_202215__949.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1122 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_202236__106 | 5 | 0.0 | 20.8852 | 2 | [0, 387] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_202236__106.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1123 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_202246__880 | 0 | 0.0 | 9.75602 | 0 | [0, 181] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_202246__880.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1124 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_201925__830 | 0 | 0.0 | 13.8355 | 0 | [0, 257] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_201925__830.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1125 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_201934__496 | 0 | 0.0 | 8.60318 | 0 | [0, 160] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_201934__496.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1126 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_201945__876 | 0 | 0.0 | 11.2939 | 0 | [0, 210] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_201945__876.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1127 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_202009__801 | 0 | 0.0 | 23.8653 | 0 | [0, 442] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_202009__801.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1128 | NVIDIA-RTX-4090-4x | audi_filter | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_202028__587 | 0 | 0.0 | 18.5723 | 0 | [0, 344] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_202028__587.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1129 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203849__759 | 0 | 0.0 | 2.64445 | 0 | [0, 322] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203849__759.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1130 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203853__445 | 0 | 0.0 | 3.82987 | 0 | [0, 464] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203853__445.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1131 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203856__972 | 0 | 0.0 | 3.21498 | 0 | [0, 391] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203856__972.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1132 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203858__141 | 4 | 0.0 | 2.1808 | 2 | [0, 266] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203858__141.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1133 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_203900__644 | 5 | 0.0 | 1.32522 | 2 | [0, 162] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_203900__644.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1134 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_203827__185 | 0 | 0.0 | 1.68861 | 0 | [0, 206] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203827__185.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1135 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_203828__332 | 0 | 0.0 | 0.810235 | 0 | [0, 99] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203828__332.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1136 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_203829__491 | 0 | 0.0 | 1.70536 | 0 | [0, 208] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203829__491.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1137 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_203831__622 | 0 | 0.0 | 1.93935 | 0 | [0, 236] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203831__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1138 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_203833__648 | 0 | 0.0 | 1.43564 | 0 | [0, 175] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_203833__648.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1139 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_203807__273 | 0 | 0.0 | 1.87916 | 0 | [0, 226] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203807__273.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1140 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_203808__493 | 0 | 0.0 | 1.20811 | 0 | [0, 146] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203808__493.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1141 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_203811__367 | 0 | 0.0 | 2.56807 | 0 | [0, 308] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203811__367.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1142 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_203813__552 | 0 | 0.0 | 2.88393 | 0 | [0, 345] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203813__552.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1143 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_203815__331 | 0 | 0.0 | 1.05214 | 0 | [0, 127] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_203815__331.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1144 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_203955__248 | 0 | 0.0 | 3.35805 | 0 | [0, 397] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_203955__248.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1145 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_204000__551 | 5 | 0.0 | 4.80521 | 2 | [0, 565] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_204000__551.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1146 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_204004__284 | 0 | 0.0 | 3.12011 | 0 | [0, 369] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_204004__284.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1147 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240131_204006__316 | 0 | 0.0 | 2.47006 | 0 | [0, 293] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_204006__316.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1148 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_204010__753 | 0 | 0.0 | 3.12525 | 0 | [0, 370] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_204010__753.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1149 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203922__800 | 0 | 0.0 | 3.25056 | 1 | [0, 383] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203922__800.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1150 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203926__335 | 0 | 0.0 | 3.76553 | 0 | [0, 444] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203926__335.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1151 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203929__191 | 0 | 0.0 | 3.17165 | 0 | [0, 375] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203929__191.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1152 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_203933__472 | 0 | 0.0 | 3.4797 | 0 | [0, 408] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203933__472.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1153 | NVIDIA-RTX-4090-4x | audi_filter | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_203936__488 | 0 | 0.0 | 2.54892 | 0 | [0, 302] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_203936__488.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1154 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_212234__806 | 0 | 0.0 | 18.2853 | 0 | [95, 530] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_212234__806.json | 0.0 | missing | missing | missing | |
| 1155 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_212245__703 | 0 | 0.0 | 10.7895 | 0 | [1, 335] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_212245__703.json | 0.0 | missing | missing | missing | |
| 1156 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_212256__409 | 0 | 0.0 | 10.836 | 0 | [1, 336] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_212256__409.json | 0.0 | missing | missing | missing | |
| 1157 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231224_232851__718 | 0 | 0.0 | 45.2377 | 0 | [115, 265] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231224_232851__718.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1158 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231224_232934__521 | 0 | 0.0 | 42.2124 | 0 | [115, 246] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231224_232934__521.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1159 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_212201__368 | 0 | 0.0 | 20.4052 | 0 | [1, 604] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_212201__368.json | 0.0 | missing | missing | missing | |
| 1160 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_212216__687 | 0 | 0.0 | 14.3885 | 0 | [1, 438] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_212216__687.json | 0.0 | missing | missing | missing | |
| 1161 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231224_232702__977 | 5 | 0.0 | 40.8284 | 2 | [118, 232] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_232702__977.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1162 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231224_232806__261 | 0 | 0.0 | 63.7885 | 0 | [118, 379] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_232806__261.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1163 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231226_213533__400 | 0 | 0.0 | 45.2163 | 0 | [118, 266] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_213533__400.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1164 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_212117__970 | 0 | 0.0 | 8.90231 | 0 | [1, 275] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_212117__970.json | 0.0 | missing | missing | missing | |
| 1165 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_212119__143 | 0 | 0.0 | 2.48083 | 0 | [1, 79] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_212119__143.json | 50.0 | missing | missing | missing | |
| 1166 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_232544__252 | 0 | 0.0 | 52.4488 | 0 | [159, 302] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_232544__252.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1167 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_232620__190 | 2 | 0.0 | 35.3575 | 2 | [159, 196] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_232620__190.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 1168 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_213448__251 | 0 | 0.0 | 54.4396 | 0 | [159, 317] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_213448__251.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1169 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_212039__932 | 0 | 0.0 | 20.6687 | 0 | [1, 574] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_212039__932.json | 0.0 | missing | missing | missing | |
| 1170 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_212054__361 | 0 | 0.0 | 15.5367 | 0 | [1, 441] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_212054__361.json | 50.0 | missing | missing | missing | |
| 1171 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_232411__590 | 5 | 0.0 | 78.5092 | 2 | [334, 259] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_232411__590.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 1172 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_232452__732 | 0 | 0.0 | 39.4602 | 0 | [334, 187] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_232452__732.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 1173 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_213353__826 | 2 | 0.0 | 80.9641 | 2 | [334, 297] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_213353__826.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 1174 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_212458__605 | 0 | 0.0 | 16.2798 | 0 | [1, 448] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_212458__605.json | 25.0 | missing | missing | missing | |
| 1175 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_212518__448 | 0 | 0.0 | 20.5976 | 0 | [1, 557] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_212518__448.json | 50.0 | missing | missing | missing | |
| 1176 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_233230__523 | 0 | 0.0 | 45.6835 | 0 | [447, 211] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233230__523.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 1177 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_233257__276 | 0 | 0.0 | 26.0678 | 0 | [447, 92] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233257__276.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 1178 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_213758__264 | 0 | 0.0 | 55.1589 | 0 | [447, 269] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_213758__264.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 1179 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_212353__424 | 0 | 0.0 | 36.2872 | 0 | [1, 929] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_212353__424.json | 0.0 | missing | missing | missing | |
| 1180 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_212419__977 | 0 | 0.0 | 26.3818 | 0 | [1, 699] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_212419__977.json | 25.0 | missing | missing | missing | |
| 1181 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_233040__885 | 0 | 0.0 | 66.7025 | 0 | [445, 335] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233040__885.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 1182 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_233145__958 | 0 | 0.0 | 63.1867 | 0 | [445, 312] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233145__958.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 1183 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_213702__163 | 0 | 0.0 | 89.187 | 0 | [445, 471] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_213702__163.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 1184 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_215357__322 | 0 | 0.0 | 7.71196 | 0 | [118, 291] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_215357__322.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1185 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_101255__833 | 0 | 0.0 | 7.57254 | 0 | [118, 286] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_101255__833.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1186 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_101303__494 | 0 | 0.0 | 8.00399 | 0 | [118, 302] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_101303__494.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1187 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_101318__509 | 0 | 0.0 | 14.7564 | 0 | [118, 553] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_101318__509.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1188 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_215349__385 | 0 | 0.0 | 7.69335 | 0 | [155, 284] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_215349__385.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1189 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_101235__876 | 0 | 0.0 | 8.44283 | 0 | [155, 313] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_101235__876.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 1190 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_101240__238 | 0 | 0.0 | 5.3209 | 0 | [155, 193] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_101240__238.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 1191 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_101248__797 | 0 | 0.0 | 7.0956 | 0 | [155, 261] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_101248__797.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1192 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_215341__511 | 0 | 0.0 | 9.86017 | 0 | [324, 203] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215341__511.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1193 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_101216__707 | 0 | 0.0 | 8.35944 | 0 | [324, 154] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_101216__707.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 1194 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_101223__171 | 0 | 0.0 | 7.45848 | 0 | [324, 243] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_101223__171.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1195 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_101226__573 | 0 | 0.0 | 3.39898 | 0 | [324, 90] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_101226__573.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 1196 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_215416__965 | 0 | 0.0 | 3.4585 | 0 | [407, 80] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_215416__965.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1197 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_101342__656 | 0 | 0.0 | 10.342 | 0 | [407, 332] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101342__656.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1198 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_101350__349 | 0 | 0.0 | 8.5885 | 0 | [407, 269] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101350__349.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1199 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_101401__991 | 0 | 0.0 | 10.189 | 0 | [407, 327] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101401__991.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1200 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_215412__503 | 0 | 0.0 | 15.3586 | 0 | [404, 508] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_215412__503.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1201 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_101322__823 | 0 | 0.0 | 3.80351 | 0 | [404, 93] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_101322__823.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1202 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_101327__304 | 0 | 0.0 | 5.19129 | 0 | [404, 145] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_101327__304.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 1203 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_101331__951 | 0 | 0.0 | 4.44021 | 0 | [404, 117] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_101331__951.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 1204 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_104429__273 | 0 | 0.0 | 4.90961 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104429__273.json | 50.0 | missing | missing | missing | |
| 1205 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_104432__639 | 0 | 0.0 | 2.72952 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104432__639.json | 50.0 | missing | missing | missing | |
| 1206 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_104435__243 | 0 | 0.0 | 3.03491 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104435__243.json | 25.0 | missing | missing | missing | |
| 1207 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_104442__828 | 0 | 0.0 | 6.60348 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104442__828.json | 0.0 | missing | missing | missing | |
| 1208 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_104445__366 | 0 | 0.0 | 3.45096 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104445__366.json | 50.0 | missing | missing | missing | |
| 1209 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_104346__948 | 0 | 0.0 | 2.60597 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104346__948.json | 50.0 | missing | missing | missing | |
| 1210 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_104349__963 | 0 | 0.0 | 2.65157 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104349__963.json | 50.0 | missing | missing | missing | |
| 1211 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_104352__692 | 0 | 0.0 | 2.46331 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104352__692.json | 0.0 | missing | missing | missing | |
| 1212 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_104358__187 | 0 | 0.0 | 6.14977 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104358__187.json | 50.0 | missing | missing | missing | |
| 1213 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_104400__735 | 0 | 0.0 | 2.27973 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104400__735.json | 50.0 | missing | missing | missing | |
| 1214 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_104313__185 | 0 | 0.0 | 2.76386 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104313__185.json | 25.0 | missing | missing | missing | |
| 1215 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_104315__956 | 0 | 0.0 | 2.64535 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104315__956.json | 0.0 | missing | missing | missing | |
| 1216 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_104318__350 | 0 | 0.0 | 3.06115 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104318__350.json | 50.0 | missing | missing | missing | |
| 1217 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_104323__600 | 0 | 0.0 | 4.86 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104323__600.json | 25.0 | missing | missing | missing | |
| 1218 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_104327__349 | 0 | 0.0 | 3.412 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104327__349.json | 50.0 | missing | missing | missing | |
| 1219 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_104601__706 | 0 | 0.0 | 11.8062 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104601__706.json | 25.0 | missing | missing | missing | |
| 1220 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_104607__294 | 0 | 0.0 | 5.56783 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104607__294.json | 25.0 | missing | missing | missing | |
| 1221 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_104612__569 | 0 | 0.0 | 5.24033 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104612__569.json | 50.0 | missing | missing | missing | |
| 1222 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_104616__662 | 0 | 0.0 | 3.78806 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104616__662.json | 25.0 | missing | missing | missing | |
| 1223 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_104619__282 | 0 | 0.0 | 2.36044 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104619__282.json | 25.0 | missing | missing | missing | |
| 1224 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_104514__507 | 0 | 0.0 | 3.38427 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104514__507.json | 50.0 | missing | missing | missing | |
| 1225 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_104517__861 | 0 | 0.0 | 3.58572 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104517__861.json | 50.0 | missing | missing | missing | |
| 1226 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_104520__868 | 0 | 0.0 | 2.89966 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104520__868.json | 0.0 | missing | missing | missing | |
| 1227 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_104523__410 | 0 | 0.0 | 2.77442 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104523__410.json | 50.0 | missing | missing | missing | |
| 1228 | Apple-MacBook-Pro-M1 | audi_filter | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_104528__598 | 0 | 0.0 | 4.94404 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104528__598.json | 0.0 | missing | missing | missing | |
| 1229 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_213714__308 | 0 | 0.0 | 9.58648 | 0 | [0, 149] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_213714__308.json | 0.0 | missing | missing | missing | |
| 1230 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_213734__443 | 0 | 0.0 | 20.5079 | 0 | [0, 320] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_213734__443.json | 25.0 | missing | missing | missing | |
| 1231 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_213741__123 | 0 | 0.0 | 6.79021 | 0 | [0, 106] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_213741__123.json | 25.0 | missing | missing | missing | |
| 1232 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_213747__798 | 0 | 0.0 | 5.55968 | 0 | [0, 85] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_213747__798.json | 25.0 | missing | missing | missing | |
| 1233 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_213756__561 | 0 | 0.0 | 9.49698 | 0 | [0, 149] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_213756__561.json | 0.0 | missing | missing | missing | |
| 1234 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_213554__532 | 0 | 0.0 | 6.26203 | 0 | [0, 98] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_213554__532.json | 0.0 | missing | missing | missing | |
| 1235 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240223_213559__109 | 0 | 0.0 | 5.00868 | 0 | [0, 78] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_213559__109.json | 25.0 | missing | missing | missing | |
| 1236 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_213606__315 | 0 | 0.0 | 6.33143 | 0 | [0, 97] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_213606__315.json | 0.0 | missing | missing | missing | |
| 1237 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_213610__238 | 0 | 0.0 | 4.18576 | 0 | [0, 65] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_213610__238.json | 50.0 | missing | missing | missing | |
| 1238 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240223_213616__304 | 0 | 0.0 | 5.60173 | 0 | [0, 88] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_213616__304.json | 25.0 | missing | missing | missing | |
| 1239 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_213323__521 | 0 | 0.0 | 24.6854 | 0 | [0, 383] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_213323__521.json | 25.0 | missing | missing | missing | |
| 1240 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_213353__323 | 0 | 0.0 | 29.8571 | 0 | [0, 461] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_213353__323.json | 25.0 | missing | missing | missing | |
| 1241 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_213419__802 | 0 | 0.0 | 26.4214 | 0 | [0, 409] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_213419__802.json | 25.0 | missing | missing | missing | |
| 1242 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_213447__142 | 0 | 0.0 | 27.7296 | 0 | [0, 425] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_213447__142.json | 25.0 | missing | missing | missing | |
| 1243 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_213518__235 | 0 | 0.0 | 30.6104 | 0 | [0, 472] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_213518__235.json | 25.0 | missing | missing | missing | |
| 1244 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_214417__569 | 0 | 0.0 | 29.5568 | 0 | [0, 453] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_214417__569.json | 50.0 | missing | missing | missing | |
| 1245 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_214439__441 | 0 | 0.0 | 22.3563 | 0 | [0, 344] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_214439__441.json | 50.0 | missing | missing | missing | |
| 1246 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_214503__483 | 0 | 0.0 | 23.8248 | 0 | [0, 367] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_214503__483.json | 50.0 | missing | missing | missing | |
| 1247 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240223_214526__173 | 0 | 0.0 | 22.3271 | 0 | [0, 344] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_214526__173.json | 25.0 | missing | missing | missing | |
| 1248 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_214547__462 | 0 | 0.0 | 20.987 | 0 | [0, 320] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_214547__462.json | 50.0 | missing | missing | missing | |
| 1249 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_214022__884 | 0 | 0.0 | 25.353 | 0 | [0, 392] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_214022__884.json | 50.0 | missing | missing | missing | |
| 1250 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_214042__691 | 0 | 0.0 | 19.9983 | 0 | [0, 307] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_214042__691.json | 50.0 | missing | missing | missing | |
| 1251 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_214103__454 | 0 | 0.0 | 21.2827 | 0 | [0, 327] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_214103__454.json | 50.0 | missing | missing | missing | |
| 1252 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_214125__315 | 0 | 0.0 | 21.6735 | 0 | [0, 334] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_214125__315.json | 50.0 | missing | missing | missing | |
| 1253 | Apple-MacBook-Pro-M1 | audi_filter | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_214146__508 | 0 | 0.0 | 21.2183 | 0 | [0, 327] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_214146__508.json | 50.0 | missing | missing | missing | |
| 1254 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231213_212104__607 | 0 | 0.000267 | 3.08139 | 0 | [99, 145] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_212104__607.json | 0.0 | missing | missing | missing | |
| 1255 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_182943__426 | 0 | 0.0002745 | 3.02207 | 0 | [99, 150] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_182943__426.json | 0.0 | missing | missing | missing | |
| 1256 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_182947__191 | 0 | 0.000273 | 3.32692 | 0 | [99, 149] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_182947__191.json | 0.0 | missing | missing | missing | |
| 1257 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo--optim | AsIs | 1SHOT | false | false | 5 | 20231215_185744__843 | 0 | 0.0 | 4.03947 | 0 | [99, 188] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_185744__843.json | 0.0 | 0.5 | missing | 0.5 | |
| 1258 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_212101__351 | 0 | 0.000414 | 5.30573 | 0 | [102, 242] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_212101__351.json | 50.0 | missing | missing | missing | |
| 1259 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_182933__136 | 0 | 0.0003315 | 3.77652 | 0 | [102, 187] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_182933__136.json | 50.0 | missing | missing | missing | |
| 1260 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_182940__518 | 0 | 0.000357 | 6.93182 | 0 | [102, 204] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_182940__518.json | 50.0 | missing | missing | missing | |
| 1261 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_190153__813 | 0 | 0.000411 | 5.13503 | 0 | [102, 240] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_190153__813.json | 50.0 | missing | missing | missing | |
| 1262 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_190200__939 | 0 | 0.000414 | 6.64588 | 0 | [102, 242] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_190200__939.json | 50.0 | missing | missing | missing | |
| 1263 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_185740__786 | 0 | 0.0 | 4.05526 | 0 | [102, 181] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_185740__786.json | 50.0 | 0.5 | missing | 0.5 | |
| 1264 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_212055__541 | 0 | 0.000274 | 10.9518 | 0 | [137, 137] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_212055__541.json | 50.0 | missing | missing | missing | |
| 1265 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_182924__574 | 0 | 0.000205 | 2.31573 | 0 | [137, 91] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_182924__574.json | 50.0 | missing | missing | missing | |
| 1266 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_182928__852 | 0 | 0.0003115 | 3.50017 | 0 | [137, 162] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_182928__852.json | 50.0 | missing | missing | missing | |
| 1267 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_190146__490 | 0 | 0.000253 | 2.59246 | 0 | [137, 123] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_190146__490.json | 0.0 | missing | missing | missing | |
| 1268 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_190147__616 | 0 | 0.000181 | 1.55225 | 0 | [137, 75] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_190147__616.json | 50.0 | missing | missing | missing | |
| 1269 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_185735__638 | 0 | 0.0 | 1.71229 | 0 | [137, 72] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_185735__638.json | 50.0 | 0.5 | missing | 0.5 | |
| 1270 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_212044__873 | 0 | 0.000254 | 3.13163 | 0 | [283, 75] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_212044__873.json | 0.0 | missing | missing | missing | |
| 1271 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_182916__622 | 0 | 0.0002675 | 3.20303 | 0 | [283, 84] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_182916__622.json | 0.0 | missing | missing | missing | |
| 1272 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_182921__394 | 0 | 0.0005675 | 5.41252 | 0 | [283, 284] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_182921__394.json | 25.0 | missing | missing | missing | |
| 1273 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_190141__360 | 0 | 0.0002975 | 2.42473 | 0 | [283, 104] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190141__360.json | 0.0 | missing | missing | missing | |
| 1274 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_190143__836 | 0 | 0.0002495 | 1.87776 | 0 | [283, 72] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190143__836.json | 0.0 | missing | missing | missing | |
| 1275 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_185733__497 | 0 | 0.0 | 2.3306 | 0 | [283, 90] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_185733__497.json | 0.0 | 0.5 | missing | 0.5 | |
| 1276 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_212109__975 | 0 | 0.0002895 | 2.08166 | 0 | [360, 73] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_212109__975.json | 0.0 | missing | missing | missing | |
| 1277 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_182957__698 | 0 | 0.0002475 | 1.05565 | 0 | [360, 45] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182957__698.json | 0.0 | missing | missing | missing | |
| 1278 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_182959__365 | 0 | 0.0003405 | 1.93441 | 0 | [360, 107] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_182959__365.json | 0.0 | missing | missing | missing | |
| 1279 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_190211__250 | 0 | 0.0002955 | 1.80306 | 0 | [360, 77] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190211__250.json | 0.0 | missing | missing | missing | |
| 1280 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_190216__730 | 0 | 0.000543 | 4.41467 | 0 | [360, 242] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190216__730.json | 50.0 | missing | missing | missing | |
| 1281 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_185753__907 | 0 | 0.0 | 6.22044 | 0 | [360, 276] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_185753__907.json | 50.0 | 0.5 | missing | 0.5 | |
| 1282 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_212106__718 | 0 | 0.000298 | 2.57063 | 0 | [359, 79] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_212106__718.json | 0.0 | missing | missing | missing | |
| 1283 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_182948__748 | 0 | 0.0002755 | 1.32541 | 0 | [359, 64] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_182948__748.json | 0.0 | missing | missing | missing | |
| 1284 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_182955__616 | 0 | 0.000853 | 7.1671 | 0 | [359, 449] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_182955__616.json | 50.0 | missing | missing | missing | |
| 1285 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_190203__119 | 0 | 0.0003595 | 2.20193 | 0 | [359, 120] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_190203__119.json | 50.0 | missing | missing | missing | |
| 1286 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_190209__188 | 0 | 0.0006325 | 6.20811 | 0 | [359, 302] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_190209__188.json | 50.0 | missing | missing | missing | |
| 1287 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_185746__145 | 0 | 0.0 | 2.20811 | 0 | [359, 86] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_185746__145.json | 0.0 | 0.5 | missing | 0.5 | |
| 1288 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231213_212124__125 | 0 | 0.000673 | 7.05695 | 0 | [99, 287] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_212124__125.json | 0.0 | missing | missing | missing | |
| 1289 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_183017__179 | 0 | 0.000339 | 2.28213 | 0 | [99, 120] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_183017__179.json | 0.0 | missing | missing | missing | |
| 1290 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_183020__382 | 0 | 0.000515 | 3.17902 | 0 | [99, 208] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_183020__382.json | 0.0 | missing | missing | missing | |
| 1291 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 5 | 20231215_185806__455 | 0 | 0.0 | 2.77465 | 0 | [99, 120] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_185806__455.json | 0.0 | 0.9 | missing | 0.1 | |
| 1292 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231213_212117__254 | 0 | 0.000358 | 2.34821 | 0 | [102, 128] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_212117__254.json | 50.0 | missing | missing | missing | |
| 1293 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_183012__442 | 0 | 0.000496 | 3.10397 | 1 | [102, 197] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_183012__442.json | 62.5 | missing | missing | missing | |
| 1294 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_183014__218 | 0 | 0.000368 | 2.13694 | 0 | [102, 133] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_183014__218.json | 50.0 | missing | missing | missing | |
| 1295 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_190231__235 | 5 | 0.000416 | 2.57036 | 2 | [102, 157] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_190231__235.json | 100.0 | missing | missing | missing | |
| 1296 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_190233__674 | 0 | 0.00035 | 2.13272 | 0 | [102, 124] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_190233__674.json | 50.0 | missing | missing | missing | |
| 1297 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_185803__125 | 0 | 0.0 | 4.38067 | 0 | [102, 151] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_185803__125.json | 50.0 | 0.9 | missing | 0.1 | |
| 1298 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_212114__921 | 5 | 0.000287 | 1.51931 | 2 | [137, 75] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_212114__921.json | 100.0 | missing | missing | missing | |
| 1299 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_183007__892 | 0 | 0.000289 | 2.12531 | 0 | [137, 76] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_183007__892.json | 50.0 | missing | missing | missing | |
| 1300 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_183009__310 | 0 | 0.000311 | 1.59889 | 1 | [137, 87] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_183009__310.json | 62.5 | missing | missing | missing | |
| 1301 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_190226__243 | 5 | 0.000313 | 1.76799 | 2 | [137, 88] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_190226__243.json | 100.0 | missing | missing | missing | |
| 1302 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_190228__882 | 5 | 0.000283 | 1.78794 | 2 | [137, 73] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_190228__882.json | 100.0 | missing | missing | missing | |
| 1303 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_185758__690 | 0 | 0.0 | 1.65097 | 0 | [137, 73] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_185758__690.json | 50.0 | 0.9 | missing | 0.1 | |
| 1304 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_212112__544 | 0 | 0.000613 | 3.45057 | 0 | [283, 165] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_212112__544.json | 25.0 | missing | missing | missing | |
| 1305 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_183002__348 | 0 | 0.000609 | 2.8648 | 0 | [283, 163] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183002__348.json | 25.0 | missing | missing | missing | |
| 1306 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_183004__985 | 0 | 0.000419 | 1.77916 | 0 | [283, 68] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183004__985.json | 50.0 | missing | missing | missing | |
| 1307 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_190220__628 | 5 | 0.000611 | 3.72188 | 2 | [283, 164] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190220__628.json | 100.0 | missing | missing | missing | |
| 1308 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_190224__900 | 0 | 0.000609 | 3.43086 | 0 | [283, 163] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190224__900.json | 25.0 | missing | missing | missing | |
| 1309 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_185756__200 | 0 | 0.0 | 3.15196 | 0 | [283, 163] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_185756__200.json | 25.0 | 0.9 | missing | 0.1 | |
| 1310 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_212131__271 | 5 | 0.000784 | 4.04549 | 2 | [360, 212] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_212131__271.json | 100.0 | missing | missing | missing | |
| 1311 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_183028__285 | 0 | 0.000522 | 1.81756 | 0 | [360, 81] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183028__285.json | 0.0 | missing | missing | missing | |
| 1312 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_183029__181 | 0 | 0.000514 | 1.52301 | 0 | [360, 77] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183029__181.json | 0.0 | missing | missing | missing | |
| 1313 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_190240__729 | 0 | 0.000528 | 1.64321 | 0 | [360, 84] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190240__729.json | 0.0 | missing | missing | missing | |
| 1314 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_190242__526 | 0 | 0.000544 | 2.12249 | 0 | [360, 92] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190242__526.json | 0.0 | missing | missing | missing | |
| 1315 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_185812__197 | 0 | 0.0 | 3.81648 | 0 | [360, 163] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_185812__197.json | 0.0 | 0.9 | missing | 0.1 | |
| 1316 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_212126__297 | 5 | 0.000549 | 1.78464 | 2 | [359, 95] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_212126__297.json | 100.0 | missing | missing | missing | |
| 1317 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_183022__517 | 0 | 0.000567 | 2.09974 | 0 | [359, 104] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_183022__517.json | 50.0 | missing | missing | missing | |
| 1318 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_183025__742 | 5 | 0.000627 | 2.36159 | 2 | [359, 134] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_183025__742.json | 100.0 | missing | missing | missing | |
| 1319 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_190236__552 | 0 | 0.000599 | 2.70633 | 0 | [359, 120] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_190236__552.json | 50.0 | missing | missing | missing | |
| 1320 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_190238__699 | 5 | 0.000509 | 1.56532 | 2 | [359, 75] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_190238__699.json | 100.0 | missing | missing | missing | |
| 1321 | Apple-MacBook-Pro-M1 | audi_filter | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_185808__546 | 0 | 0.0 | 2.36302 | 1 | [359, 87] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_185808__546.json | 62.5 | 0.9 | missing | 0.1 | |
| 1322 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231213_212258__246 | 0 | 0.00933 | 23.5667 | 0 | [99, 278] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_212258__246.json | 0.0 | missing | missing | missing | |
| 1323 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_183225__164 | 0 | 0.01047 | 15.2344 | 0 | [99, 316] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_183225__164.json | 0.0 | missing | missing | missing | |
| 1324 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_183244__922 | 0 | 0.0093 | 19.0537 | 0 | [99, 277] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_183244__922.json | 0.0 | missing | missing | missing | |
| 1325 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 5 | 20231215_185956__221 | 0 | 0.0 | 26.4645 | 0 | [99, 276] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_185956__221.json | 0.0 | 0.1 | missing | 0.9 | |
| 1326 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_212235__902 | 0 | 0.0102 | 23.2644 | 0 | [102, 306] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_212235__902.json | 50.0 | missing | missing | missing | |
| 1327 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_183141__448 | 0 | 0.00993 | 10.9262 | 0 | [102, 297] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_183141__448.json | 50.0 | missing | missing | missing | |
| 1328 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_183209__116 | 0 | 0.01584 | 28.0333 | 0 | [102, 494] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_183209__116.json | 50.0 | missing | missing | missing | |
| 1329 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_190428__731 | 0 | 0.01299 | 27.8619 | 0 | [102, 399] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_190428__731.json | 50.0 | missing | missing | missing | |
| 1330 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_190455__823 | 0 | 0.01071 | 26.4308 | 0 | [102, 323] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_190455__823.json | 50.0 | missing | missing | missing | |
| 1331 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_185929__700 | 0 | 0.0 | 22.0677 | 0 | [102, 299] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_185929__700.json | 50.0 | 0.1 | missing | 0.9 | |
| 1332 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_212211__321 | 0 | 0.00737 | 15.7328 | 0 | [137, 200] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_212211__321.json | 50.0 | missing | missing | missing | |
| 1333 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_183120__209 | 5 | 0.00602 | 10.2582 | 2 | [137, 155] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_183120__209.json | 100.0 | missing | missing | missing | |
| 1334 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_183129__284 | 0 | 0.00752 | 9.19301 | 0 | [137, 205] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_183129__284.json | 50.0 | missing | missing | missing | |
| 1335 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_190343__377 | 5 | 0.00629 | 10.7885 | 2 | [137, 164] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_190343__377.json | 100.0 | missing | missing | missing | |
| 1336 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_190400__831 | 0 | 0.00917 | 16.7527 | 0 | [137, 260] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_190400__831.json | 50.0 | missing | missing | missing | |
| 1337 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_185907__217 | 0 | 0.0 | 13.5537 | 0 | [137, 195] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_185907__217.json | 50.0 | 0.1 | missing | 0.9 | |
| 1338 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_212155__927 | 5 | 0.01429 | 24.139 | 2 | [283, 382] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_212155__927.json | 100.0 | missing | missing | missing | |
| 1339 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_183053__623 | 0 | 0.01519 | 24.0606 | 0 | [283, 412] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183053__623.json | 25.0 | missing | missing | missing | |
| 1340 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_183109__380 | 0 | 0.01255 | 15.2246 | 0 | [283, 324] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183109__380.json | 50.0 | missing | missing | missing | |
| 1341 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_190302__271 | 5 | 0.01303 | 19.9719 | 2 | [283, 340] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190302__271.json | 100.0 | missing | missing | missing | |
| 1342 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_190332__401 | 0 | 0.0151 | 29.2127 | 0 | [283, 409] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190332__401.json | 25.0 | missing | missing | missing | |
| 1343 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_185853__548 | 0 | 0.0 | 40.4382 | 0 | [283, 347] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_185853__548.json | 25.0 | 0.1 | missing | 0.9 | |
| 1344 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_212349__890 | 0 | 0.01305 | 20.9855 | 0 | [360, 315] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_212349__890.json | 50.0 | missing | missing | missing | |
| 1345 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_183407__439 | 5 | 0.01473 | 22.8898 | 2 | [360, 371] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183407__439.json | 100.0 | missing | missing | missing | |
| 1346 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_183427__887 | 0 | 0.01509 | 19.7307 | 0 | [360, 383] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183427__887.json | 50.0 | missing | missing | missing | |
| 1347 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_190636__885 | 0 | 0.01302 | 20.79 | 0 | [360, 314] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190636__885.json | 50.0 | missing | missing | missing | |
| 1348 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_190654__444 | 0 | 0.01212 | 17.7371 | 0 | [360, 284] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190654__444.json | 50.0 | missing | missing | missing | |
| 1349 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_190056__852 | 0 | 0.0 | 27.9292 | 0 | [360, 379] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_190056__852.json | 50.0 | 0.1 | missing | 0.9 | |
| 1350 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_212327__385 | 0 | 0.01436 | 29.1392 | 0 | [359, 359] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_212327__385.json | 50.0 | missing | missing | missing | |
| 1351 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_183330__448 | 0 | 0.02048 | 45.9156 | 0 | [359, 563] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_183330__448.json | 50.0 | missing | missing | missing | |
| 1352 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_183343__794 | 0 | 0.01334 | 12.7691 | 0 | [359, 325] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_183343__794.json | 50.0 | missing | missing | missing | |
| 1353 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_190541__440 | 0 | 0.01673 | 45.6252 | 0 | [359, 438] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_190541__440.json | 50.0 | missing | missing | missing | |
| 1354 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_190615__787 | 0 | 0.02117 | 33.3192 | 0 | [359, 586] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_190615__787.json | 50.0 | missing | missing | missing | |
| 1355 | Apple-MacBook-Pro-M1 | audi_filter | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_190028__485 | 0 | 0.0 | 31.5022 | 0 | [359, 374] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_190028__485.json | 50.0 | 0.1 | missing | 0.9 | |
| 1356 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | AsIs | 1SHOT | false | false | 5 | 20231213_231817__268 | 0 | 0.0 | 17.5628 | 0 | [95, 508] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__AsIs__1SHOT__20231213_231817__268.json | 0.0 | missing | missing | missing | |
| 1357 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | AsIs | 1SHOT | false | false | 5 | 20231224_224131__802 | 0 | 0.0 | 19.2917 | 0 | [95, 556] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__AsIs__1SHOT__20231224_224131__802.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1358 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | AsIs | 1SHOT | false | false | 5 | 20231224_224142__467 | 0 | 0.0 | 10.9679 | 0 | [1, 339] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__AsIs__1SHOT__20231224_224142__467.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1359 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | InJulia | 1SHOT | true | false | 5 | 20231213_231800__199 | 0 | 0.0 | 14.3483 | 0 | [112, 413] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__InJulia__1SHOT__20231213_231800__199.json | 25.0 | missing | missing | missing | |
| 1360 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | InJulia | 1SHOT | false | false | 5 | 20231224_224047__802 | 0 | 0.0 | 24.8359 | 0 | [112, 698] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__InJulia__1SHOT__20231224_224047__802.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1361 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | InJulia | 1SHOT | false | false | 5 | 20231224_224112__809 | 0 | 0.0 | 24.7241 | 0 | [1, 716] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__InJulia__1SHOT__20231224_224112__809.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1362 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | InJulia | 1SHOT | false | false | 5 | 20231226_211603__901 | 0 | 0.0 | 12.3288 | 0 | [112, 361] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__InJulia__1SHOT__20231226_211603__901.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1363 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_231745__966 | 0 | 0.0 | 14.0635 | 0 | [141, 393] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertAsk__1SHOT__20231213_231745__966.json | 0.0 | missing | missing | missing | |
| 1364 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224015__818 | 0 | 0.0 | 11.0828 | 0 | [141, 308] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_224015__818.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1365 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224022__531 | 0 | 0.0 | 6.81989 | 0 | [1, 211] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_224022__531.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1366 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_211551__635 | 0 | 0.0 | 17.1247 | 0 | [141, 486] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_211551__635.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1367 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_231731__185 | 0 | 0.0 | 23.1192 | 0 | [311, 567] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231731__185.json | 25.0 | missing | missing | missing | |
| 1368 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_223947__859 | 0 | 0.0 | 28.0687 | 0 | [329, 545] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223947__859.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1369 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_224004__559 | 0 | 0.0 | 17.1974 | 0 | [1, 483] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224004__559.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1370 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_211534__533 | 0 | 0.0 | 15.8703 | 0 | [329, 240] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211534__533.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1371 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_231858__331 | 0 | 0.0 | 18.5963 | 0 | [11, 500] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231858__331.json | 25.0 | missing | missing | missing | |
| 1372 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224237__697 | 0 | 0.0 | 17.3283 | 0 | [11, 468] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224237__697.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1373 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224303__188 | 3 | 0.0 | 25.7189 | 2 | [1, 680] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224303__188.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1374 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_211658__605 | 0 | 0.0 | 21.9796 | 0 | [11, 593] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211658__605.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1375 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_231839__652 | 0 | 0.0 | 21.487 | 0 | [412, 480] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapTask__1SHOT__20231213_231839__652.json | 0.0 | missing | missing | missing | |
| 1376 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_224205__630 | 0 | 0.0 | 22.6624 | 0 | [412, 511] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_224205__630.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1377 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_224220__662 | 0 | 0.0 | 14.9063 | 0 | [1, 412] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_224220__662.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1378 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_211636__533 | 0 | 0.0 | 32.2092 | 0 | [412, 754] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_211636__533.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1379 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | AsIs | 1SHOT | false | false | 5 | 20231213_232911__590 | 0 | 0.0 | 20.3852 | 0 | [95, 585] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__AsIs__1SHOT__20231213_232911__590.json | 0.0 | missing | missing | missing | |
| 1380 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | AsIs | 1SHOT | false | false | 5 | 20231224_230700__492 | 0 | 0.0 | 8.89538 | 0 | [109, 283] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__AsIs__1SHOT__20231224_230700__492.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1381 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | AsIs | 1SHOT | false | false | 5 | 20231224_230707__598 | 0 | 0.0 | 6.58132 | 0 | [109, 206] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__AsIs__1SHOT__20231224_230707__598.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1382 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | InJulia | 1SHOT | false | false | 5 | 20231213_232850__881 | 0 | 0.0 | 24.6754 | 0 | [112, 694] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__InJulia__1SHOT__20231213_232850__881.json | 0.0 | missing | missing | missing | |
| 1383 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | InJulia | 1SHOT | true | true | 5 | 20231224_230642__886 | 0 | 0.0 | 9.19222 | 1 | [112, 293] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__InJulia__1SHOT__20231224_230642__886.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1384 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | InJulia | 1SHOT | true | true | 5 | 20231224_230651__560 | 0 | 0.0 | 8.54393 | 0 | [112, 270] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__InJulia__1SHOT__20231224_230651__560.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1385 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | InJulia | 1SHOT | true | true | 5 | 20231226_212545__133 | 0 | 0.0 | 6.49587 | 0 | [112, 203] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__InJulia__1SHOT__20231226_212545__133.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1386 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_232826__278 | 0 | 0.0 | 16.8748 | 0 | [141, 471] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231213_232826__278.json | 50.0 | missing | missing | missing | |
| 1387 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231224_230627__531 | 0 | 0.0 | 8.29796 | 0 | [151, 258] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231224_230627__531.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1388 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230633__374 | 0 | 0.0 | 5.77436 | 0 | [151, 173] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231224_230633__374.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1389 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_212538__236 | 0 | 0.0 | 9.45083 | 0 | [151, 296] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_212538__236.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1390 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_232809__845 | 0 | 0.0 | 15.818 | 0 | [311, 376] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232809__845.json | 0.0 | missing | missing | missing | |
| 1391 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_230609__993 | 0 | 0.0 | 15.8305 | 0 | [321, 275] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230609__993.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1392 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230619__293 | 0 | 0.0 | 9.58397 | 0 | [321, 266] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230619__293.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1393 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_212528__901 | 0 | 0.0 | 15.7137 | 0 | [321, 278] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212528__901.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1394 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_233007__616 | 0 | 0.0 | 26.9342 | 0 | [11, 704] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233007__616.json | 25.0 | missing | missing | missing | |
| 1395 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_230741__371 | 0 | 0.0 | 9.64082 | 0 | [415, 254] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230741__371.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1396 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_230752__484 | 0 | 0.0 | 10.9637 | 0 | [415, 295] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230752__484.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1397 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_212604__890 | 0 | 0.0 | 10.6755 | 0 | [415, 285] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212604__890.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1398 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_232941__632 | 0 | 0.0 | 29.5601 | 0 | [412, 680] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapTask__1SHOT__20231213_232941__632.json | 50.0 | missing | missing | missing | |
| 1399 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_230719__699 | 0 | 0.0 | 11.8408 | 0 | [412, 321] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapTask__1SHOT__20231224_230719__699.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1400 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_230730__136 | 0 | 0.0 | 11.6385 | 0 | [412, 318] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapTask__1SHOT__20231224_230730__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1401 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_212553__387 | 0 | 0.0 | 8.28417 | 0 | [412, 212] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_212553__387.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1402 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_174702__416 | 0 | 0.0 | 11.2576 | 0 | [112, 211] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174702__416.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1403 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_174714__635 | 0 | 0.0 | 11.0988 | 0 | [112, 208] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174714__635.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1404 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_174727__264 | 0 | 0.0 | 12.809 | 0 | [112, 238] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174727__264.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1405 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_174621__733 | 4 | 0.0 | 11.5945 | 2 | [151, 213] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174621__733.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1406 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_174638__972 | 0 | 0.0 | 16.5895 | 0 | [151, 313] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174638__972.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1407 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_174651__317 | 0 | 0.0 | 12.6341 | 1 | [151, 233] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174651__317.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1408 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_174529__689 | 0 | 0.0 | 18.5315 | 0 | [321, 325] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174529__689.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1409 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_174551__387 | 0 | 0.0 | 22.3068 | 1 | [321, 396] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174551__387.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1410 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_174609__581 | 0 | 0.0 | 16.8155 | 0 | [321, 292] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174609__581.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1411 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_174835__337 | 4 | 0.0 | 18.2052 | 2 | [415, 308] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174835__337.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1412 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_174857__233 | 0 | 0.0 | 21.4252 | 0 | [415, 353] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174857__233.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1413 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_174912__613 | 0 | 0.0 | 14.6523 | 0 | [415, 241] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174912__613.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1414 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_174748__762 | 0 | 0.0 | 20.5169 | 0 | [412, 352] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174748__762.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1415 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_174758__672 | 0 | 0.0 | 8.77152 | 0 | [412, 127] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174758__672.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1416 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_174817__479 | 0 | 0.0 | 19.4392 | 0 | [412, 333] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174817__479.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1417 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231213_212659__576 | 0 | 0.00308804 | 21.6282 | 0 | [107, 346] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__AsIs__1SHOT__20231213_212659__576.json | 0.0 | missing | missing | missing | |
| 1418 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_183829__815 | 0 | 0.00235185 | 6.09738 | 0 | [107, 255] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__AsIs__1SHOT__20231225_183829__815.json | 0.0 | missing | missing | missing | |
| 1419 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_183839__676 | 0 | 0.00397794 | 10.1774 | 0 | [107, 456] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__AsIs__1SHOT__20231225_183839__676.json | 0.0 | missing | missing | missing | |
| 1420 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium--optim | AsIs | 1SHOT | false | false | 5 | 20231215_190415__550 | 0 | 0.0 | 40.3143 | 0 | [107, 477] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__AsIs__1SHOT__20231215_190415__550.json | 0.0 | 0.9 | missing | 0.3 | |
| 1421 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231213_212637__854 | 0 | 0.00236804 | 23.2279 | 0 | [110, 256] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__InJulia__1SHOT__20231213_212637__854.json | 25.0 | missing | missing | missing | |
| 1422 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_183800__856 | 0 | 0.0019959 | 8.2914 | 0 | [110, 210] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__InJulia__1SHOT__20231225_183800__856.json | 50.0 | missing | missing | missing | |
| 1423 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_183823__815 | 0 | 0.00277254 | 22.4759 | 0 | [110, 306] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__InJulia__1SHOT__20231225_183823__815.json | 50.0 | missing | missing | missing | |
| 1424 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231227_191021__598 | 0 | 0.00266737 | 11.5312 | 0 | [110, 293] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__InJulia__1SHOT__20231227_191021__598.json | 25.0 | missing | missing | missing | |
| 1425 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_191031__324 | 0 | 0.0025622 | 9.27629 | 0 | [110, 280] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__InJulia__1SHOT__20231227_191031__324.json | 50.0 | missing | missing | missing | |
| 1426 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium--optim | InJulia | 1SHOT | true | true | 5 | 20231215_190334__452 | 0 | 0.0 | 25.0894 | 0 | [110, 299] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__InJulia__1SHOT__20231215_190334__452.json | 50.0 | 0.9 | missing | 0.3 | |
| 1427 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_212614__283 | 0 | 0.00187468 | 16.133 | 0 | [149, 182] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_212614__283.json | 25.0 | missing | missing | missing | |
| 1428 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_183739__164 | 0 | 0.00295874 | 7.14681 | 0 | [149, 316] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_183739__164.json | 25.0 | missing | missing | missing | |
| 1429 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_183752__939 | 0 | 0.00252997 | 13.0339 | 0 | [149, 263] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_183752__939.json | 25.0 | missing | missing | missing | |
| 1430 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_190951__845 | 0 | 0.00201221 | 17.2813 | 0 | [149, 199] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_190951__845.json | 25.0 | missing | missing | missing | |
| 1431 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_191010__439 | 0 | 0.00231963 | 18.8515 | 0 | [149, 237] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_191010__439.json | 25.0 | missing | missing | missing | |
| 1432 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_190309__163 | 0 | 0.0 | 28.5438 | 0 | [149, 343] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_190309__163.json | 50.0 | 0.9 | missing | 0.3 | |
| 1433 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_212558__895 | 0 | 0.00569103 | 51.5785 | 0 | [319, 597] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_212558__895.json | 50.0 | missing | missing | missing | |
| 1434 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_183706__587 | 0 | 0.00622497 | 30.889 | 0 | [319, 663] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183706__587.json | 50.0 | missing | missing | missing | |
| 1435 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_183732__798 | 0 | 0.00475259 | 24.5435 | 0 | [319, 481] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183732__798.json | 50.0 | missing | missing | missing | |
| 1436 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_190901__355 | 0 | 0.00376561 | 14.0584 | 0 | [319, 359] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190901__355.json | 25.0 | missing | missing | missing | |
| 1437 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_190933__621 | 0 | 0.0053108 | 32.1245 | 0 | [319, 550] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190933__621.json | 50.0 | missing | missing | missing | |
| 1438 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_190241__334 | 0 | 0.0 | 51.9823 | 0 | [319, 474] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_190241__334.json | 50.0 | 0.9 | missing | 0.3 | |
| 1439 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_212841__777 | 0 | 0.00598258 | 54.358 | 0 | [412, 602] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_212841__777.json | 50.0 | missing | missing | missing | |
| 1440 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_183928__674 | 0 | 0.00631427 | 22.5043 | 0 | [412, 643] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183928__674.json | 50.0 | missing | missing | missing | |
| 1441 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_183940__736 | 0 | 0.00547291 | 12.3795 | 0 | [412, 539] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183940__736.json | 25.0 | missing | missing | missing | |
| 1442 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_191206__666 | 0 | 0.0053192 | 33.9664 | 0 | [412, 520] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191206__666.json | 50.0 | missing | missing | missing | |
| 1443 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_191230__602 | 0 | 0.00391963 | 23.5977 | 0 | [412, 347] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191230__602.json | 25.0 | missing | missing | missing | |
| 1444 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_190609__778 | 0 | 0.0 | 69.6863 | 0 | [412, 826] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_190609__778.json | 50.0 | 0.9 | missing | 0.3 |
| 1445 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_212746__245 | 0 | 0.00552144 | 47.0555 | 0 | [409, 546] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_212746__245.json | 50.0 | missing | missing | missing | |
| 1446 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_183853__501 | 5 | 0.00598257 | 13.8966 | 2 | [409, 603] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_183853__501.json | 100.0 | missing | missing | missing | |
| 1447 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_183905__142 | 5 | 0.00438884 | 11.4223 | 2 | [409, 406] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_183905__142.json | 100.0 | missing | missing | missing | |
| 1448 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_191101__838 | 0 | 0.00527874 | 30.8602 | 0 | [409, 516] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_191101__838.json | 25.0 | missing | missing | missing | |
| 1449 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191132__194 | 0 | 0.00527065 | 30.7681 | 0 | [409, 515] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_191132__194.json | 50.0 | missing | missing | missing | |
| 1450 | Apple-MacBook-Pro-M1 | audi_filter | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_190459__861 | 0 | 0.0 | 44.1435 | 0 | [409, 546] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_190459__861.json | 50.0 | 0.9 | missing | 0.3 |
| 1451 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231213_212451__364 | 0 | 0.000520603 | 3.24475 | 0 | [109, 232] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__AsIs__1SHOT__20231213_212451__364.json | 0.0 | missing | missing | missing | |
| 1452 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_183545__626 | 0 | 0.000572983 | 3.5438 | 0 | [109, 259] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__AsIs__1SHOT__20231225_183545__626.json | 0.0 | missing | missing | missing | |
| 1453 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_183551__606 | 0 | 0.000854283 | 5.40047 | 0 | [109, 404] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__AsIs__1SHOT__20231225_183551__606.json | 0.0 | missing | missing | missing | |
| 1454 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small--optim | AsIs | 1SHOT | false | false | 5 | 20231215_190135__631 | 0 | 0.0 | 4.80402 | 0 | [109, 367] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__AsIs__1SHOT__20231215_190135__631.json | 0.0 | 0.9 | missing | 0.3 |
| 1455 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231213_212448__465 | 0 | 0.000924124 | 5.99354 | 0 | [112, 439] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__InJulia__1SHOT__20231213_212448__465.json | 50.0 | missing | missing | missing | |
| 1456 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_183536__492 | 0 | 0.00125392 | 8.19106 | 0 | [112, 609] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__InJulia__1SHOT__20231225_183536__492.json | 50.0 | missing | missing | missing | |
| 1457 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_183542__277 | 0 | 0.000790264 | 5.08968 | 0 | [112, 370] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__InJulia__1SHOT__20231225_183542__277.json | 50.0 | missing | missing | missing | |
| 1458 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_190804__640 | 0 | 0.000790264 | 4.9794 | 0 | [112, 370] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__InJulia__1SHOT__20231227_190804__640.json | 50.0 | missing | missing | missing | |
| 1459 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_190809__586 | 0 | 0.000759224 | 4.79861 | 0 | [112, 354] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__InJulia__1SHOT__20231227_190809__586.json | 50.0 | missing | missing | missing | |
| 1460 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small--optim | InJulia | 1SHOT | true | true | 5 | 20231215_190131__650 | 0 | 0.0 | 5.49055 | 0 | [112, 411] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__InJulia__1SHOT__20231215_190131__650.json | 50.0 | 0.9 | missing | 0.3 |
| 1461 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_212442__735 | 0 | 0.000483111 | 2.71532 | 0 | [153, 198] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_212442__735.json | 50.0 | missing | missing | missing | |
| 1462 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_183525__891 | 0 | 0.000496691 | 2.84076 | 0 | [153, 205] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_183525__891.json | 50.0 | missing | missing | missing | |
| 1463 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_183528__475 | 0 | 0.000446251 | 2.60964 | 0 | [153, 179] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_183528__475.json | 50.0 | missing | missing | missing | |
| 1464 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_190756__906 | 0 | 0.000477291 | 2.88328 | 0 | [153, 195] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_190756__906.json | 50.0 | missing | missing | missing | |
| 1465 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_190759__417 | 0 | 0.000481171 | 2.77205 | 0 | [153, 197] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_190759__417.json | 50.0 | missing | missing | missing | |
| 1466 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_190125__825 | 0 | 0.0 | 2.73271 | 0 | [153, 199] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_190125__825.json | 50.0 | 0.9 | missing | 0.3 |
| 1467 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_212439__493 | 0 | 0.00115441 | 6.48779 | 0 | [333, 484] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_212439__493.json | 50.0 | missing | missing | missing | |
| 1468 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_183516__735 | 0 | 0.000395871 | 1.44245 | 0 | [333, 93] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183516__735.json | 0.0 | missing | missing | missing | |
| 1469 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_183522__726 | 0 | 0.00105741 | 6.05911 | 0 | [333, 434] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183522__726.json | 50.0 | missing | missing | missing | |
| 1470 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_190744__241 | 0 | 0.00137557 | 8.31873 | 0 | [333, 598] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190744__241.json | 50.0 | missing | missing | missing | |
| 1471 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_190753__242 | 0 | 0.00122037 | 8.23652 | 0 | [333, 518] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190753__242.json | 50.0 | missing | missing | missing | |
| 1472 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_190122__826 | 0 | 0.0 | 7.72923 | 0 | [333, 577] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_190122__826.json | 0.0 | 0.9 | missing | 0.3 |
| 1473 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_212506__177 | 0 | 0.00120617 | 6.62242 | 0 | [419, 482] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_212506__177.json | 50.0 | missing | missing | missing | |
| 1474 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_183621__621 | 0 | 0.00165043 | 14.8681 | 0 | [419, 711] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183621__621.json | 50.0 | missing | missing | missing | |
| 1475 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_183635__823 | 0 | 0.000957853 | 14.4649 | 1 | [419, 354] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183635__823.json | 62.5 | missing | missing | missing | |
| 1476 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_190841__184 | 0 | 0.00124497 | 6.84306 | 0 | [419, 502] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190841__184.json | 25.0 | missing | missing | missing | |
| 1477 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_190847__696 | 5 | 0.00111693 | 6.10465 | 2 | [419, 436] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190847__696.json | 100.0 | missing | missing | missing | |
| 1478 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_190149__222 | 0 | 0.0 | 7.06384 | 0 | [419, 526] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_190149__222.json | 50.0 | 0.9 | missing | 0.3 |
| 1479 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_212459__816 | 0 | 0.00144156 | 8.32753 | 0 | [417, 604] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_212459__816.json | 25.0 | missing | missing | missing | |
| 1480 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_183559__517 | 0 | 0.00144932 | 8.25561 | 0 | [417, 608] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_183559__517.json | 50.0 | missing | missing | missing | |
| 1481 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_183606__842 | 0 | 0.00122428 | 6.67891 | 0 | [417, 492] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_183606__842.json | 50.0 | missing | missing | missing | |
| 1482 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_190821__627 | 0 | 0.00203714 | 12.3355 | 0 | [417, 911] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_190821__627.json | 50.0 | missing | missing | missing | |
| 1483 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_190834__899 | 0 | 0.00139888 | 12.8627 | 0 | [417, 582] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_190834__899.json | 50.0 | missing | missing | missing | |
| 1484 | Apple-MacBook-Pro-M1 | audi_filter | mistral-small--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_190141__719 | 0 | 0.0 | 5.90634 | 1 | [417, 440] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_190141__719.json | 62.5 | 0.9 | missing | 0.3 |
| 1485 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_212418__478 | 0 | 0.000145271 | 4.23151 | 0 | [109, 287] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__AsIs__1SHOT__20231213_212418__478.json | 0.0 | missing | missing | missing | |
| 1486 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_183454__828 | 0 | 0.000118091 | 2.06854 | 0 | [109, 227] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__AsIs__1SHOT__20231225_183454__828.json | 0.0 | missing | missing | missing | |
| 1487 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_183458__301 | 0 | 0.000179699 | 3.32565 | 0 | [109, 363] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__AsIs__1SHOT__20231225_183458__301.json | 0.0 | missing | missing | missing | |
| 1488 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny--optim | AsIs | 1SHOT | false | false | 5 | 20231215_190109__392 | 0 | 0.0 | 1.75361 | 0 | [109, 204] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__AsIs__1SHOT__20231215_190109__392.json | 0.0 | 0.9 | missing | 0.3 |
| 1489 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231213_212414__529 | 0 | 0.00028295 | 10.1847 | 0 | [112, 590] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__InJulia__1SHOT__20231213_212414__529.json | 50.0 | missing | missing | missing | |
| 1490 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | InJulia | 1SHOT | false | false | 5 | 20231225_183448__192 | 0 | 0.000219077 | 4.10639 | 0 | [112, 449] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__InJulia__1SHOT__20231225_183448__192.json | 0.0 | missing | missing | missing | |
| 1491 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_183452__628 | 0 | 0.000203222 | 3.69748 | 0 | [112, 414] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__InJulia__1SHOT__20231225_183452__628.json | 50.0 | missing | missing | missing | |
| 1492 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_190714__996 | 0 | 0.000239462 | 4.46137 | 0 | [112, 494] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__InJulia__1SHOT__20231227_190714__996.json | 50.0 | missing | missing | missing | |
| 1493 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | InJulia | 1SHOT | false | false | 5 | 20231227_190717__924 | 0 | 0.000153392 | 2.82955 | 0 | [112, 304] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__InJulia__1SHOT__20231227_190717__924.json | 0.0 | missing | missing | missing | |
| 1494 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny--optim | InJulia | 1SHOT | true | true | 5 | 20231215_190107__636 | 0 | 0.0 | 3.54999 | 0 | [112, 425] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__InJulia__1SHOT__20231215_190107__636.json | 50.0 | 0.9 | missing | 0.3 |
| 1495 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_212404__342 | 0 | 6.4002e-5 | 1.76376 | 0 | [153, 94] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_212404__342.json | 50.0 | missing | missing | missing | |
| 1496 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_183443__878 | 0 | 6.1737e-5 | 0.886266 | 0 | [153, 89] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_183443__878.json | 0.0 | missing | missing | missing | |
| 1497 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_183444__759 | 0 | 6.4002e-5 | 0.998094 | 0 | [153, 94] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_183444__759.json | 0.0 | missing | missing | missing | |
| 1498 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_190708__991 | 0 | 6.5814e-5 | 1.06041 | 0 | [153, 98] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_190708__991.json | 0.0 | missing | missing | missing | |
| 1499 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_190709__149 | 0 | 7.2156e-5 | 1.18554 | 0 | [153, 112] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_190709__149.json | 0.0 | missing | missing | missing | |
| 1500 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_190103__417 | 0 | 0.0 | 0.867842 | 0 | [153, 89] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_190103__417.json | 50.0 | 0.9 | missing | 0.3 |
| 1501 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_212402__251 | 5 | 0.00035919 | 12.42 | 2 | [333, 690] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_212402__251.json | 100.0 | missing | missing | missing | |
| 1502 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_183437__338 | 0 | 0.000289428 | 10.0203 | 0 | [333, 536] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183437__338.json | 50.0 | missing | missing | missing | |
| 1503 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_183442__911 | 0 | 0.000275385 | 4.71929 | 0 | [333, 505] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183442__911.json | 0.0 | missing | missing | missing | |
| 1504 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_190703__510 | 0 | 0.000238692 | 8.64345 | 0 | [333, 424] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190703__510.json | 50.0 | missing | missing | missing | |
| 1505 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_190707__588 | 0 | 0.000233709 | 3.84702 | 0 | [333, 413] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_190707__588.json | 0.0 | missing | missing | missing | |
| 1506 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_190103__857 | 0 | 0.0 | 5.95209 | 0 | [333, 341] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_190103__857.json | 50.0 | 0.9 | missing | 0.3 |
| 1507 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_212432__937 | 0 | 0.000211774 | 5.86387 | 0 | [419, 338] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_212432__937.json | 50.0 | missing | missing | missing | |
| 1508 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_183512__709 | 0 | 0.000268852 | 4.33207 | 0 | [419, 464] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183512__709.json | 0.0 | missing | missing | missing | |
| 1509 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_183515__928 | 0 | 0.00019456 | 2.83086 | 0 | [419, 300] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_183515__928.json | 0.0 | missing | missing | missing | |
| 1510 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_190732__364 | 0 | 0.000241672 | 3.67015 | 0 | [419, 404] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190732__364.json | 0.0 | missing | missing | missing | |
| 1511 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_190736__104 | 0 | 0.000245749 | 3.67145 | 0 | [419, 413] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_190736__104.json | 50.0 | missing | missing | missing | |
| 1512 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_190114__109 | 0 | 0.0 | 2.58939 | 0 | [419, 304] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_190114__109.json | 0.0 | 0.9 | missing | 0.3 | |
| 1513 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_212426__370 | 0 | 0.000335163 | 7.38953 | 0 | [417, 611] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_212426__370.json | 50.0 | missing | missing | missing | |
| 1514 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_183502__872 | 0 | 0.000273102 | 4.5156 | 0 | [417, 474] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_183502__872.json | 50.0 | missing | missing | missing | |
| 1515 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_183508__437 | 0 | 0.000296205 | 5.03273 | 0 | [417, 525] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_183508__437.json | 25.0 | missing | missing | missing | |
| 1516 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_190722__299 | 0 | 0.00030753 | 5.21585 | 0 | [417, 550] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_190722__299.json | 0.0 | missing | missing | missing | |
| 1517 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_190728__960 | 0 | 0.000316137 | 5.43037 | 0 | [417, 569] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_190728__960.json | 50.0 | missing | missing | missing | |
| 1518 | Apple-MacBook-Pro-M1 | audi_filter | mistral-tiny--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_190112__820 | 0 | 0.0 | 2.69594 | 0 | [417, 313] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_190112__820.json | 0.0 | 0.9 | missing | 0.3 | |
| 1519 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_213344__593 | 0 | 0.0 | 16.2187 | 0 | [95, 473] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_213344__593.json | 0.0 | missing | missing | missing | |
| 1520 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_213358__984 | 0 | 0.0 | 13.8025 | 0 | [1, 422] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_213358__984.json | 0.0 | missing | missing | missing | |
| 1521 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_213407__136 | 0 | 0.0 | 9.28015 | 0 | [1, 290] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_213407__136.json | 0.0 | missing | missing | missing | |
| 1522 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231224_233731__595 | 0 | 0.0 | 11.2689 | 0 | [108, 276] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231224_233731__595.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1523 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231224_233744__830 | 0 | 0.0 | 12.7961 | 0 | [108, 316] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231224_233744__830.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1524 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_213311__618 | 0 | 0.0 | 16.5616 | 0 | [1, 499] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_213311__618.json | 0.0 | missing | missing | missing | |
| 1525 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_213328__160 | 0 | 0.0 | 16.282 | 0 | [1, 491] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_213328__160.json | 25.0 | missing | missing | missing | |
| 1526 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231224_233708__463 | 0 | 0.0 | 3.67477 | 0 | [111, 79] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_233708__463.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1527 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231224_233720__806 | 0 | 0.0 | 11.4593 | 0 | [111, 281] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_233720__806.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1528 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_214023__652 | 0 | 0.0 | 8.5989 | 0 | [111, 207] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_214023__652.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1529 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_213227__461 | 0 | 0.0 | 17.2195 | 0 | [1, 511] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_213227__461.json | 25.0 | missing | missing | missing | |
| 1530 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_213235__831 | 0 | 0.0 | 8.55477 | 0 | [1, 265] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_213235__831.json | 0.0 | missing | missing | missing | |
| 1531 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_233700__276 | 0 | 0.0 | 3.54085 | 0 | [152, 71] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233700__276.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1532 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_233704__668 | 0 | 0.0 | 4.04586 | 0 | [152, 84] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233704__668.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1533 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_214015__467 | 0 | 0.0 | 6.7531 | 0 | [152, 154] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_214015__467.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1534 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_213131__968 | 0 | 0.0 | 28.7351 | 0 | [1, 774] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_213131__968.json | 0.0 | missing | missing | missing | |
| 1535 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_213154__282 | 0 | 0.0 | 22.4363 | 0 | [1, 619] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_213154__282.json | 0.0 | missing | missing | missing | |
| 1536 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_233641__328 | 0 | 0.0 | 16.933 | 0 | [332, 244] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233641__328.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1537 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_233657__267 | 0 | 0.0 | 16.1602 | 0 | [332, 362] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233657__267.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1538 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_214008__365 | 0 | 0.0 | 30.5404 | 0 | [332, 582] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_214008__365.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1539 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_213610__868 | 0 | 0.0 | 19.1421 | 0 | [1, 521] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_213610__868.json | 50.0 | missing | missing | missing | |
| 1540 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_213630__338 | 0 | 0.0 | 20.1993 | 0 | [1, 547] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_213630__338.json | 0.0 | missing | missing | missing | |
| 1541 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_233826__783 | 0 | 0.0 | 15.4193 | 0 | [419, 329] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233826__783.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1542 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_233842__547 | 0 | 0.0 | 15.9489 | 0 | [419, 342] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233842__547.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1543 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_214126__873 | 0 | 0.0 | 28.7342 | 0 | [419, 649] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_214126__873.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1544 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_213508__315 | 0 | 0.0 | 28.6565 | 0 | [1, 753] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_213508__315.json | 0.0 | missing | missing | missing | |
| 1545 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_213532__982 | 0 | 0.0 | 23.2481 | 0 | [1, 623] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_213532__982.json | 0.0 | missing | missing | missing | |
| 1546 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_233752__628 | 0 | 0.0 | 8.05931 | 0 | [417, 146] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233752__628.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1547 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_233811__441 | 0 | 0.0 | 18.4726 | 0 | [417, 404] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233811__441.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1548 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_214057__899 | 0 | 0.0 | 33.4894 | 0 | [417, 760] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_214057__899.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1549 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_223421__490 | 0 | 0.0 | 13.7961 | 0 | [110, 431] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223421__490.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1550 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_223432__207 | 0 | 0.0 | 10.8264 | 0 | [110, 336] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223432__207.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1551 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_223443__532 | 0 | 0.0 | 10.209 | 0 | [110, 316] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223443__532.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1552 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_223453__642 | 0 | 0.0 | 10.5213 | 0 | [110, 326] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223453__642.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1553 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_223502__365 | 0 | 0.0 | 7.57016 | 0 | [110, 230] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223502__365.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1554 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_223352__241 | 0 | 0.0 | 4.00805 | 0 | [151, 108] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223352__241.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1555 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_223355__144 | 0 | 0.0 | 3.25964 | 0 | [151, 84] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223355__144.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1556 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_223359__315 | 0 | 0.0 | 3.53285 | 0 | [151, 93] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223359__315.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1557 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_223403__366 | 0 | 0.0 | 3.93564 | 0 | [151, 106] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223403__366.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1558 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_223407__371 | 0 | 0.0 | 4.05486 | 0 | [151, 110] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223407__371.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1559 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223252__281 | 0 | 0.0 | 12.9609 | 0 | [331, 286] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223252__281.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1560 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_223307__939 | 0 | 0.0 | 14.5606 | 0 | [331, 410] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223307__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1561 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_223320__446 | 0 | 0.0 | 13.0876 | 0 | [331, 365] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223320__446.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1562 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223337__363 | 5 | 0.0 | 17.4109 | 2 | [331, 497] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223337__363.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1563 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223347__839 | 0 | 0.0 | 9.32001 | 0 | [331, 247] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223347__839.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1564 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_223644__253 | 0 | 0.0 | 11.4823 | 0 | [418, 298] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223644__253.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1565 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_223657__511 | 0 | 0.0 | 13.4346 | 0 | [418, 358] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223657__511.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1566 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_223711__312 | 0 | 0.0 | 13.5921 | 0 | [418, 363] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223711__312.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1567 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_223727__734 | 0 | 0.0 | 15.8653 | 0 | [418, 432] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223727__734.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1568 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_223740__743 | 0 | 0.0 | 12.3264 | 0 | [418, 324] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223740__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1569 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_223528__818 | 0 | 0.0 | 26.524 | 0 | [416, 749] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_223528__818.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |
| 1639 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_142248__206 | 0 | 0.0 | 51.1402 | 0 | [427, 236] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_142248__206.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1640 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_142400__452 | 5 | 0.0 | 71.5277 | 2 | [427, 355] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_142400__452.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1641 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_213913__339 | 0 | 0.0 | 15.282 | 0 | [95, 446] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_213913__339.json | 0.0 | missing | missing | missing | |
| 1642 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_213929__531 | 0 | 0.0 | 16.0426 | 0 | [1, 485] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_213929__531.json | 0.0 | missing | missing | missing | |
| 1643 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_213941__627 | 0 | 0.0 | 11.6902 | 0 | [1, 361] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_213941__627.json | 0.0 | missing | missing | missing | |
| 1644 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231224_234009__843 | 0 | 0.0 | 11.4924 | 0 | [116, 281] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231224_234009__843.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1645 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231224_234019__577 | 0 | 0.0 | 10.148 | 0 | [116, 247] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231224_234019__577.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1646 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_213840__726 | 0 | 0.0 | 22.8524 | 0 | [1, 669] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_213840__726.json | 25.0 | missing | missing | missing | |
| 1647 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_213858__640 | 0 | 0.0 | 18.3056 | 0 | [1, 547] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_213858__640.json | 50.0 | missing | missing | missing | |
| 1648 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231224_233946__782 | 0 | 0.0 | 10.7987 | 0 | [119, 263] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231224_233946__782.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1649 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231224_233957__126 | 0 | 0.0 | 10.4082 | 0 | [119, 253] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231224_233957__126.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1650 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_214205__557 | 0 | 0.0 | 9.74859 | 0 | [119, 236] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_214205__557.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1651 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_213759__420 | 0 | 0.0 | 13.4379 | 0 | [1, 406] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_213759__420.json | 50.0 | missing | missing | missing | |
| 1652 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_213803__115 | 0 | 0.0 | 4.49214 | 0 | [1, 142] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_213803__115.json | 0.0 | missing | missing | missing | |
| 1653 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_233928__721 | 0 | 0.0 | 4.70385 | 0 | [160, 101] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233928__721.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1654 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_233935__166 | 0 | 0.0 | 6.90787 | 0 | [160, 158] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233935__166.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1655 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_214156__580 | 0 | 0.0 | 4.73735 | 0 | [160, 102] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_214156__580.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1656 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_213723__588 | 0 | 0.0 | 34.3122 | 0 | [1, 906] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_213723__588.json | 0.0 | missing | missing | missing | |
| 1657 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_213737__829 | 0 | 0.0 | 13.7169 | 0 | [1, 392] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_213737__829.json | 0.0 | missing | missing | missing | |
| 1658 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_233904__193 | 0 | 0.0 | 22.0211 | 0 | [340, 349] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233904__193.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1659 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_233923__539 | 4 | 0.0 | 18.3041 | 2 | [340, 414] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233923__539.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1660 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_214151__747 | 0 | 0.0 | 24.9021 | 0 | [340, 430] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_214151__747.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1661 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_214140__839 | 0 | 0.0 | 24.5033 | 0 | [1, 653] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214140__839.json | 50.0 | missing | missing | missing | |
| 1662 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_214147__964 | 0 | 0.0 | 7.21136 | 0 | [1, 206] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214147__964.json | 0.0 | missing | missing | missing | |
| 1663 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_234106__311 | 0 | 0.0 | 14.1238 | 0 | [427, 296] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234106__311.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1664 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234123__290 | 0 | 0.0 | 16.4729 | 0 | [427, 354] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234123__290.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1665 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_214244__622 | 0 | 0.0 | 19.976 | 0 | [427, 438] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_214244__622.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1666 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_214023__424 | 0 | 0.0 | 19.5782 | 0 | [1, 532] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_214023__424.json | 50.0 | missing | missing | missing | |
| 1667 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_214052__228 | 0 | 0.0 | 29.2956 | 0 | [1, 768] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_214052__228.json | 0.0 | missing | missing | missing | |
| 1668 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234037__734 | 0 | 0.0 | 17.5782 | 0 | [425, 381] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_234037__734.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1669 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234052__285 | 0 | 0.0 | 14.3603 | 0 | [425, 302] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_234052__285.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1670 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_214224__913 | 0 | 0.0 | 18.1037 | 0 | [425, 393] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_214224__913.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1671 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231213_232006__495 | 0 | 0.0 | 12.6386 | 0 | [95, 369] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231213_232006__495.json | 0.0 | missing | missing | missing | |
| 1672 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231224_224431__743 | 0 | 0.0 | 8.15119 | 0 | [114, 253] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231224_224431__743.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1673 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231224_224438__677 | 0 | 0.0 | 7.4447 | 0 | [114, 230] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231224_224438__677.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1674 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231213_231953__596 | 0 | 0.0 | 19.4905 | 0 | [112, 556] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231213_231953__596.json | 0.0 | missing | missing | missing | |
| 1675 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231224_224416__986 | 0 | 0.0 | 7.62096 | 0 | [117, 235] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_224416__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1676 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231224_224422__889 | 0 | 0.0 | 5.86776 | 0 | [117, 177] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_224422__889.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1677 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_211733__576 | 0 | 0.0 | 9.62439 | 0 | [117, 302] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_211733__576.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1678 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_231934__963 | 0 | 0.0 | 15.021 | 0 | [141, 420] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231213_231934__963.json | 0.0 | missing | missing | missing | |
| 1679 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224358__556 | 0 | 0.0 | 10.6317 | 0 | [158, 327] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_224358__556.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1680 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224408__499 | 0 | 0.0 | 9.63523 | 1 | [158, 296] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_224408__499.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1681 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_211723__898 | 0 | 0.0 | 6.36802 | 1 | [158, 189] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_211723__898.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1682 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_231919__844 | 0 | 0.0 | 21.1151 | 0 | [311, 516] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231919__844.json | 0.0 | missing | missing | missing | |
| 1683 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_224329__341 | 0 | 0.0 | 25.7076 | 0 | [338, 593] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224329__341.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1684 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_224346__779 | 0 | 0.0 | 17.3167 | 0 | [338, 498] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224346__779.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1685 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_211716__848 | 0 | 0.0 | 18.334 | 0 | [338, 374] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211716__848.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1686 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_232046__172 | 0 | 0.0 | 17.6245 | 0 | [11, 475] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232046__172.json | 0.0 | missing | missing | missing | |
| 1687 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224534__730 | 0 | 0.0 | 21.3382 | 0 | [425, 599] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224534__730.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1688 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224547__734 | 0 | 0.0 | 12.872 | 0 | [425, 343] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224547__734.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1689 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_211747__237 | 0 | 0.0 | 2.44985 | 0 | [425, 15] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211747__237.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1690 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_232028__384 | 0 | 0.0 | 22.3044 | 0 | [412, 501] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231213_232028__384.json | 25.0 | missing | missing | missing | |
| 1691 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_224456__348 | 0 | 0.0 | 17.4095 | 0 | [423, 484] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_224456__348.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1692 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_224512__386 | 0 | 0.0 | 16.4644 | 0 | [423, 456] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_224512__386.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1693 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_211744__202 | 4 | 0.0 | 10.4729 | 2 | [423, 272] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_211744__202.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1694 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231213_233314__862 | 0 | 0.0 | 24.4595 | 0 | [95, 693] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__AsIs__1SHOT__20231213_233314__862.json | 0.0 | missing | missing | missing | |
| 1695 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231224_231109__854 | 0 | 0.0 | 14.9609 | 0 | [112, 263] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__AsIs__1SHOT__20231224_231109__854.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1696 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231224_231128__870 | 0 | 0.0 | 18.3097 | 0 | [112, 325] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__AsIs__1SHOT__20231224_231128__870.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1697 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231213_233249__106 | 0 | 0.0 | 13.9883 | 0 | [112, 402] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__InJulia__1SHOT__20231213_233249__106.json | 0.0 | missing | missing | missing | |
| 1698 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231224_231038__295 | 0 | 0.0 | 21.4286 | 0 | [115, 382] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__InJulia__1SHOT__20231224_231038__295.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1699 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231224_231054__612 | 0 | 0.0 | 15.9452 | 0 | [115, 281] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__InJulia__1SHOT__20231224_231054__612.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1700 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231226_212718__778 | 0 | 0.0 | 2.96316 | 0 | [115, 37] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__InJulia__1SHOT__20231226_212718__778.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1701 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_233235__673 | 0 | 0.0 | 8.37548 | 0 | [141, 227] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231213_233235__673.json | 0.0 | missing | missing | missing | |
| 1702 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_231008__340 | 0 | 0.0 | 7.60282 | 0 | [154, 120] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231224_231008__340.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1703 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_231017__392 | 0 | 0.0 | 8.54455 | 0 | [154, 137] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231224_231017__392.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1704 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_212715__465 | 0 | 0.0 | 8.50667 | 0 | [154, 136] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_212715__465.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1705 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_233227__412 | 0 | 0.0 | 26.1747 | 0 | [311, 644] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233227__412.json | 0.0 | missing | missing | missing | |
| 1706 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230944__455 | 0 | 0.0 | 32.0509 | 0 | [324, 354] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230944__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1707 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_231001__159 | 0 | 0.0 | 17.0012 | 0 | [324, 260] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231224_231001__159.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1708 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_212706__286 | 0 | 0.0 | 22.6313 | 0 | [324, 199] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212706__286.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1709 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_233354__787 | 0 | 0.0 | 15.8157 | 0 | [11, 429] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233354__787.json | 50.0 | missing | missing | missing | |
| 1710 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_231257__882 | 0 | 0.0 | 28.0475 | 0 | [418, 435] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231224_231257__882.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1711 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_231324__848 | 0 | 0.0 | 27.0871 | 0 | [418, 418] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231224_231324__848.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1712 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_212834__337 | 0 | 0.0 | 42.8094 | 0 | [418, 685] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212834__337.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1713 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_233338__868 | 0 | 0.0 | 24.794 | 0 | [412, 564] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231213_233338__868.json | 0.0 | missing | missing | missing | |
| 1714 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_231205__714 | 0 | 0.0 | 37.4836 | 0 | [415, 598] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231224_231205__714.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1715 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_231229__971 | 0 | 0.0 | 23.2519 | 0 | [415, 356] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231224_231229__971.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1716 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_212751__683 | 0 | 0.0 | 33.8862 | 0 | [415, 540] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_212751__683.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1717 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_214429__882 | 0 | 0.0 | 22.1699 | 0 | [95, 635] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_214429__882.json | 0.0 | missing | missing | missing | |
| 1718 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_214443__556 | 0 | 0.0 | 13.8019 | 0 | [1, 422] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_214443__556.json | 0.0 | missing | missing | missing | |
| 1719 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_214506__736 | 0 | 0.0 | 23.455 | 0 | [1, 685] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_214506__736.json | 0.0 | missing | missing | missing | |
| 1720 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231224_234257__801 | 0 | 0.0 | 24.3156 | 0 | [109, 892] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231224_234257__801.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1721 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231224_234300__641 | 0 | 0.0 | 2.25811 | 0 | [109, 78] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231224_234300__641.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1722 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_214326__954 | 0 | 0.0 | 15.061 | 0 | [112, 435] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_214326__954.json | 25.0 | missing | missing | missing | |
| 1723 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231219_214344__971 | 0 | 0.0 | 17.7868 | 0 | [1, 533] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_214344__971.json | 0.0 | missing | missing | missing | |
| 1724 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_214407__151 | 0 | 0.0 | 23.1137 | 0 | [1, 676] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_214407__151.json | 25.0 | missing | missing | missing | |
| 1725 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231224_234221__551 | 0 | 0.0 | 24.0657 | 0 | [112, 883] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231224_234221__551.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1726 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231224_234233__458 | 0 | 0.0 | 11.6301 | 0 | [112, 441] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231224_234233__458.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1727 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_214306__922 | 0 | 0.0 | 15.6848 | 0 | [1, 469] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_214306__922.json | 0.0 | missing | missing | missing | |
| 1728 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_214311__611 | 0 | 0.0 | 4.92129 | 0 | [1, 155] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_214311__611.json | 0.0 | missing | missing | missing | |
| 1729 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_234152__302 | 0 | 0.0 | 15.8027 | 0 | [149, 586] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231224_234152__302.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1730 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_234157__339 | 0 | 0.0 | 5.01382 | 0 | [149, 182] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231224_234157__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1731 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_214323__915 | 0 | 0.0 | 29.0251 | 0 | [149, 1035] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_214323__915.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1732 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_214224__855 | 0 | 0.0 | 14.7061 | 0 | [1, 419] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_214224__855.json | 0.0 | missing | missing | missing | |
| 1733 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_214240__884 | 0 | 0.0 | 15.4339 | 0 | [1, 438] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_214240__884.json | 0.0 | missing | missing | missing | |
| 1734 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_234132__894 | 0 | 0.0 | 8.79392 | 0 | [318, 157] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234132__894.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1735 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_234137__267 | 0 | 0.0 | 4.35688 | 0 | [318, 130] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234137__267.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1736 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_214254__220 | 0 | 0.0 | 9.13263 | 0 | [318, 179] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_214254__220.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1737 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_214633__208 | 0 | 0.0 | 21.0881 | 0 | [11, 564] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214633__208.json | 0.0 | missing | missing | missing | |
| 1738 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_214651__480 | 3 | 0.0 | 17.886 | 2 | [1, 489] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214651__480.json | 90.0 | missing | missing | missing | |
| 1739 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_214708__999 | 0 | 0.0 | 17.2944 | 0 | [1, 474] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214708__999.json | 0.0 | missing | missing | missing | |
| 1740 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234340__185 | 0 | 0.0 | 10.5159 | 0 | [401, 340] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234340__185.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1741 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_234349__720 | 0 | 0.0 | 8.87795 | 0 | [401, 281] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234349__720.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1742 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_214530__938 | 0 | 0.0 | 23.4796 | 0 | [412, 533] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_214530__938.json | 50.0 | missing | missing | missing | |
| 1743 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_214554__571 | 0 | 0.0 | 23.8673 | 0 | [1, 638] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_214554__571.json | 50.0 | missing | missing | missing | |
| 1744 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_214612__400 | 0 | 0.0 | 17.754 | 0 | [1, 486] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_214612__400.json | 50.0 | missing | missing | missing | |
| 1745 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_234326__790 | 0 | 0.0 | 26.2133 | 0 | [398, 872] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231224_234326__790.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1746 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234329__784 | 0 | 0.0 | 3.59231 | 0 | [398, 86] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231224_234329__784.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1747 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231213_233454__338 | 0 | 0.0 | 18.6451 | 0 | [95, 537] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231213_233454__338.json | 0.0 | missing | missing | missing | |
| 1748 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231224_231853__694 | 0 | 0.0 | 34.6505 | 0 | [120, 260] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231224_231853__694.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1749 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231224_231939__939 | 0 | 0.0 | 45.9565 | 0 | [120, 350] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231224_231939__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1750 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231213_233436__898 | 0 | 0.0 | 15.124 | 0 | [112, 434] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231213_233436__898.json | 25.0 | missing | missing | missing | |
| 1751 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231224_231743__437 | 4 | 0.0 | 54.3185 | 2 | [123, 416] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231224_231743__437.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1752 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231224_231818__635 | 5 | 0.0 | 33.6594 | 2 | [123, 252] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231224_231818__635.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1753 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231226_213108__692 | 0 | 0.0 | 33.0507 | 0 | [123, 246] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_213108__692.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1754 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_233421__636 | 0 | 0.0 | 4.65878 | 0 | [141, 115] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231213_233421__636.json | 50.0 | missing | missing | missing | |
| 1755 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_231559__406 | 5 | 0.0 | 23.5117 | 2 | [162, 160] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231224_231559__406.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1756 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_231649__993 | 0 | 0.0 | 49.315 | 0 | [162, 365] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231224_231649__993.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1757 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_213034__973 | 0 | 0.0 | 45.1372 | 0 | [162, 331] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_213034__973.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1758 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_233416__661 | 0 | 0.0 | 21.0427 | 0 | [311, 513] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233416__661.json | 0.0 | missing | missing | missing | |
| 1759 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_231454__311 | 0 | 0.0 | 90.6269 | 0 | [332, 477] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_231454__311.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1760 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_231535__880 | 0 | 0.0 | 40.2175 | 0 | [332, 263] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_231535__880.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1761 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_212949__376 | 5 | 0.0 | 74.5487 | 2 | [332, 366] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212949__376.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1762 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_233540__548 | 0 | 0.0 | 20.5867 | 0 | [11, 549] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233540__548.json | 25.0 | missing | missing | missing | |
| 1763 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_232147__405 | 5 | 0.0 | 53.5722 | 2 | [426, 346] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_232147__405.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1764 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_232252__494 | 0 | 0.0 | 64.9799 | 0 | [426, 433] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_232252__494.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1765 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_213231__403 | 5 | 0.0 | 45.0821 | 2 | [426, 281] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_213231__403.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1766 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_233520__723 | 0 | 0.0 | 25.2515 | 0 | [412, 575] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231213_233520__723.json | 0.0 | missing | missing | missing | |
| 1767 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_232005__484 | 0 | 0.0 | 25.9733 | 0 | [423, 134] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231224_232005__484.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1768 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_232053__991 | 0 | 0.0 | 47.9716 | 0 | [423, 305] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231224_232053__991.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1769 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_213146__866 | 5 | 0.0 | 37.6451 | 2 | [423, 225] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_213146__866.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1770 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_212804__826 | 0 | 0.0 | 15.4424 | 0 | [95, 451] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_212804__826.json | 0.0 | missing | missing | missing | |
| 1771 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_212826__802 | 0 | 0.0 | 21.9907 | 0 | [1, 647] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_212826__802.json | 0.0 | missing | missing | missing | |
| 1772 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_212845__224 | 0 | 0.0 | 19.0576 | 0 | [1, 568] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_212845__224.json | 0.0 | missing | missing | missing | |
| 1773 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231224_233457__845 | 0 | 0.0 | 11.1693 | 0 | [116, 179] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231224_233457__845.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1774 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231224_233513__625 | 0 | 0.0 | 15.1668 | 0 | [116, 249] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231224_233513__625.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1775 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_212733__752 | 0 | 0.0 | 15.7299 | 0 | [1, 476] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_212733__752.json | 0.0 | missing | missing | missing | |
| 1776 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_212748__114 | 0 | 0.0 | 15.0261 | 0 | [1, 456] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_212748__114.json | 25.0 | missing | missing | missing | |
| 1777 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231224_233432__346 | 0 | 0.0 | 18.7286 | 0 | [119, 310] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231224_233432__346.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1778 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231224_233446__959 | 0 | 0.0 | 13.5703 | 0 | [119, 221] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231224_233446__959.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1779 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231226_213853__919 | 0 | 0.0 | 13.2754 | 0 | [119, 216] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_213853__919.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1780 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_212648__356 | 0 | 0.0 | 11.022 | 0 | [1, 337] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_212648__356.json | 0.0 | missing | missing | missing | |
| 1781 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_212656__375 | 0 | 0.0 | 8.73431 | 0 | [1, 270] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_212656__375.json | 0.0 | missing | missing | missing | |
| 1782 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_233358__844 | 0 | 0.0 | 13.7336 | 0 | [160, 219] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233358__844.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1783 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_233413__359 | 0 | 0.0 | 15.1158 | 0 | [160, 243] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233413__359.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1784 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_213839__487 | 0 | 0.0 | 9.82033 | 0 | [160, 151] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_213839__487.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1785 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_212551__205 | 0 | 0.0 | 9.93304 | 0 | [1, 289] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_212551__205.json | 0.0 | missing | missing | missing | |
| 1786 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_212617__795 | 0 | 0.0 | 25.8143 | 0 | [1, 703] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_212617__795.json | 0.0 | missing | missing | missing | |
| 1787 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_233326__936 | 0 | 0.0 | 28.0832 | 0 | [340, 262] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233326__936.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1788 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_233344__659 | 0 | 0.0 | 18.8453 | 0 | [340, 276] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233344__659.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1789 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_213829__156 | 0 | 0.0 | 30.6923 | 0 | [340, 327] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_213829__156.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1790 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_213040__609 | 0 | 0.0 | 28.087 | 1 | [1, 739] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_213040__609.json | 62.5 | missing | missing | missing | |
| 1791 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_213041__202 | 0 | 0.0 | 1.4005 | 0 | [1, 41] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_213041__202.json | 0.0 | missing | missing | missing | |
| 1792 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_233607__846 | 0 | 0.0 | 17.0352 | 0 | [427, 231] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233607__846.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1793 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_233624__405 | 0 | 0.0 | 16.2612 | 0 | [427, 218] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233624__405.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1794 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_213937__216 | 0 | 0.0 | 24.5057 | 0 | [427, 355] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_213937__216.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1795 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_212929__969 | 0 | 0.0 | 24.7569 | 0 | [1, 660] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_212929__969.json | 50.0 | missing | missing | missing | |
| 1796 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_212953__314 | 0 | 0.0 | 23.4219 | 0 | [1, 627] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_212953__314.json | 50.0 | missing | missing | missing | |
| 1797 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_233534__311 | 0 | 0.0 | 21.0067 | 0 | [425, 297] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233534__311.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1798 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_233550__155 | 0 | 0.0 | 16.62 | 0 | [425, 224] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233550__155.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1799 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_213913__900 | 0 | 0.0 | 20.1565 | 0 | [425, 283] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_213913__900.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1800 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231213_233113__411 | 0 | 0.0 | 19.2756 | 0 | [95, 555] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__AsIs__1SHOT__20231213_233113__411.json | 0.0 | missing | missing | missing | |
| 1801 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231224_230832__226 | 0 | 0.0 | 5.4051 | 0 | [115, 299] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__AsIs__1SHOT__20231224_230832__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1802 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231224_230837__791 | 0 | 0.0 | 4.68443 | 0 | [115, 259] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__AsIs__1SHOT__20231224_230837__791.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1803 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231213_233053__748 | 0 | 0.0 | 13.593 | 0 | [112, 391] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__InJulia__1SHOT__20231213_233053__748.json | 0.0 | missing | missing | missing | |
| 1804 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231224_230823__904 | 0 | 0.0 | 5.81542 | 0 | [118, 321] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__InJulia__1SHOT__20231224_230823__904.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1805 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231224_230826__952 | 0 | 0.0 | 3.78288 | 0 | [118, 207] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__InJulia__1SHOT__20231224_230826__952.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1806 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231226_212629__169 | 0 | 0.0 | 3.3126 | 0 | [118, 180] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_212629__169.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1807 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_233040__791 | 0 | 0.0 | 10.8457 | 0 | [141, 300] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231213_233040__791.json | 0.0 | missing | missing | missing | |
| 1808 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230812__434 | 0 | 0.0 | 2.77818 | 0 | [155, 142] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231224_230812__434.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1809 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230817__108 | 0 | 0.0 | 4.43217 | 0 | [155, 236] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231224_230817__108.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1810 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_212626__925 | 0 | 0.0 | 2.25792 | 0 | [155, 112] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_212626__925.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1811 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_233029__960 | 0 | 0.0 | 21.2506 | 0 | [311, 519] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233029__960.json | 0.0 | missing | missing | missing | |
| 1812 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230802__781 | 0 | 0.0 | 8.81938 | 0 | [321, 270] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230802__781.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1813 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230810__717 | 0 | 0.0 | 8.06067 | 0 | [321, 385] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230810__717.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1814 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_212624__329 | 0 | 0.0 | 18.5575 | 0 | [321, 766] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212624__329.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1815 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_233201__833 | 0 | 0.0 | 25.6671 | 0 | [11, 674] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233201__833.json | 0.0 | missing | missing | missing | |
| 1816 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_230900__549 | 0 | 0.0 | 8.13601 | 0 | [405, 367] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230900__549.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1817 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_230912__225 | 0 | 0.0 | 11.3215 | 0 | [405, 522] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230912__225.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1818 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_212643__409 | 0 | 0.0 | 6.62203 | 0 | [405, 290] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212643__409.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1819 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_233134__861 | 0 | 0.0 | 21.8603 | 0 | [412, 490] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231213_233134__861.json | 50.0 | missing | missing | missing | |
| 1820 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_230844__933 | 0 | 0.0 | 7.24821 | 0 | [403, 322] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231224_230844__933.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1821 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_230852__414 | 0 | 0.0 | 7.95047 | 0 | [403, 358] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231224_230852__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1822 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_212637__885 | 0 | 0.0 | 7.48784 | 0 | [403, 335] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_212637__885.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1823 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231213_232151__978 | 0 | 0.0 | 9.67541 | 0 | [95, 282] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__AsIs__1SHOT__20231213_232151__978.json | 0.0 | missing | missing | missing | |
| 1824 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231224_224723__229 | 0 | 0.0 | 8.92212 | 0 | [116, 277] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__AsIs__1SHOT__20231224_224723__229.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1825 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231224_224732__199 | 0 | 0.0 | 8.95006 | 0 | [116, 279] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__AsIs__1SHOT__20231224_224732__199.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1826 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | InJulia | 1SHOT | false | false | 5 | 20231213_232142__371 | 0 | 0.0 | 17.2403 | 0 | [112, 494] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__InJulia__1SHOT__20231213_232142__371.json | 0.0 | missing | missing | missing | |
| 1827 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231224_224702__570 | 0 | 0.0 | 10.3447 | 0 | [119, 322] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_224702__570.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1828 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231224_224713__996 | 0 | 0.0 | 11.1954 | 0 | [119, 351] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_224713__996.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1829 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_211838__359 | 0 | 0.0 | 6.86297 | 0 | [119, 210] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_211838__359.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1830 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_232124__248 | 0 | 0.0 | 17.4209 | 0 | [141, 487] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231213_232124__248.json | 0.0 | missing | missing | missing | |
| 1831 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224643__370 | 0 | 0.0 | 11.9172 | 0 | [160, 369] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_224643__370.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1832 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224651__372 | 0 | 0.0 | 7.36829 | 0 | [160, 218] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_224651__372.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1833 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_211831__297 | 0 | 0.0 | 21.8422 | 0 | [160, 681] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_211831__297.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1834 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_232107__914 | 0 | 0.0 | 20.8324 | 0 | [311, 509] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232107__914.json | 0.0 | missing | missing | missing | |
| 1835 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_224617__476 | 0 | 0.0 | 30.7815 | 0 | [340, 735] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224617__476.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1836 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_224631__187 | 0 | 0.0 | 14.0719 | 0 | [340, 400] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224631__187.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1837 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_211809__923 | 0 | 0.0 | 22.2716 | 0 | [340, 495] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211809__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1838 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_232239__130 | 0 | 0.0 | 20.3745 | 0 | [11, 544] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232239__130.json | 50.0 | missing | missing | missing | |
| 1839 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_224817__594 | 0 | 0.0 | 16.7572 | 0 | [427, 463] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224817__594.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1840 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224838__377 | 0 | 0.0 | 20.8225 | 0 | [427, 587] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224838__377.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1841 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_211911__651 | 0 | 0.0 | 13.5888 | 0 | [427, 368] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211911__651.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1842 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_232219__385 | 0 | 0.0 | 27.7508 | 0 | [412, 637] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231213_232219__385.json | 25.0 | missing | missing | missing | |
| 1843 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_224745__587 | 0 | 0.0 | 13.4577 | 0 | [425, 363] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_224745__587.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1844 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_224800__824 | 0 | 0.0 | 14.5616 | 0 | [425, 397] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_224800__824.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1845 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_211858__692 | 0 | 0.0 | 19.481 | 0 | [425, 548] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_211858__692.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1846 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231213_232345__756 | 0 | 0.0 | 18.6151 | 0 | [95, 537] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__AsIs__1SHOT__20231213_232345__756.json | 0.0 | missing | missing | missing | |
| 1847 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231224_225336__763 | 0 | 0.0 | 35.0649 | 0 | [113, 254] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__AsIs__1SHOT__20231224_225336__763.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1848 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231224_225416__116 | 0 | 0.0 | 39.6968 | 0 | [113, 289] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__AsIs__1SHOT__20231224_225416__116.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1849 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231213_232326__504 | 0 | 0.0 | 20.5142 | 0 | [112, 584] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__InJulia__1SHOT__20231213_232326__504.json | 25.0 | missing | missing | missing | |
| 1850 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231224_225221__793 | 0 | 0.0 | 37.5888 | 0 | [116, 273] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_225221__793.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1851 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231224_225301__996 | 0 | 0.0 | 39.6453 | 0 | [116, 289] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_225301__996.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1852 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231226_212155__973 | 0 | 0.0 | 49.9478 | 0 | [116, 368] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_212155__973.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1853 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_232305__263 | 0 | 0.0 | 10.2219 | 0 | [141, 282] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231213_232305__263.json | 0.0 | missing | missing | missing | |
| 1854 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_225120__521 | 0 | 0.0 | 33.8258 | 1 | [155, 238] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_225120__521.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1855 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_225143__501 | 0 | 0.0 | 23.3089 | 0 | [155, 157] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_225143__501.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1856 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_212105__910 | 0 | 0.0 | 30.023 | 0 | [155, 210] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_212105__910.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1857 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_232255__458 | 0 | 0.0 | 15.7184 | 0 | [311, 373] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232255__458.json | 0.0 | missing | missing | missing | |
| 1858 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_224932__141 | 0 | 0.0 | 54.1286 | 0 | [343, 164] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224932__141.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1859 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_225045__451 | 0 | 0.0 | 72.7819 | 0 | [343, 490] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_225045__451.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1860 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_212035__514 | 0 | 0.0 | 82.9453 | 0 | [343, 404] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212035__514.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1861 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_232434__585 | 0 | 0.0 | 17.3704 | 0 | [11, 468] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232434__585.json | 25.0 | missing | missing | missing | |
| 1862 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_225700__945 | 0 | 0.0 | 52.2443 | 0 | [429, 322] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231224_225700__945.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1863 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_225826__331 | 0 | 0.0 | 85.5014 | 0 | [429, 562] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231224_225826__331.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1864 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_212329__461 | 0 | 0.0 | 41.3752 | 0 | [429, 243] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212329__461.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1865 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_232416__618 | 0 | 0.0 | 31.815 | 0 | [412, 733] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231213_232416__618.json | 0.0 | missing | missing | missing | |
| 1866 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_225518__574 | 0 | 0.0 | 62.5352 | 0 | [427, 397] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_225518__574.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1867 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_225608__381 | 0 | 0.0 | 49.3816 | 0 | [427, 301] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_225608__381.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1868 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_212247__769 | 0 | 0.0 | 52.0133 | 0 | [427, 322] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_212247__769.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1869 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231213_234216__258 | 0 | 0.0 | 13.9401 | 0 | [44, 421] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231213_234216__258.json | 0.0 | missing | missing | missing | |
| 1870 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_000158__609 | 0 | 0.0 | 2.66902 | 0 | [66, 36] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_000158__609.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1871 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_000202__158 | 0 | 0.0 | 4.58032 | 0 | [66, 73] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_000202__158.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1872 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231213_234202__292 | 0 | 0.0 | 14.4885 | 0 | [60, 435] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231213_234202__292.json | 0.0 | missing | missing | missing | |
| 1873 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_000132__115 | 0 | 0.0 | 13.9409 | 0 | [68, 251] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_000132__115.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1874 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_000155__638 | 0 | 0.0 | 23.0082 | 0 | [68, 420] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_000155__638.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1875 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231226_220105__496 | 0 | 0.0 | 12.926 | 0 | [68, 233] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_220105__496.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1876 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234148__621 | 0 | 0.0 | 6.81969 | 0 | [90, 196] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231213_234148__621.json | 50.0 | missing | missing | missing | |
| 1877 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_000114__853 | 0 | 0.0 | 7.30339 | 0 | [107, 120] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_000114__853.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1878 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_000118__488 | 0 | 0.0 | 3.53673 | 0 | [107, 48] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_000118__488.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1879 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_220052__882 | 0 | 0.0 | 3.35666 | 0 | [107, 45] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_220052__882.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1880 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234141__159 | 0 | 0.0 | 14.3533 | 0 | [201, 379] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234141__159.json | 50.0 | missing | missing | missing | |
| 1881 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000058__606 | 0 | 0.0 | 14.7881 | 0 | [219, 57] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000058__606.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1882 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000107__340 | 0 | 0.0 | 8.9092 | 0 | [219, 134] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000107__340.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1883 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_220048__688 | 0 | 0.0 | 12.8722 | 0 | [219, 33] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220048__688.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1884 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_234257__422 | 0 | 0.0 | 17.814 | 0 | [11, 487] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234257__422.json | 25.0 | missing | missing | missing | |
| 1885 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_000257__381 | 0 | 0.0 | 8.87821 | 0 | [372, 107] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000257__381.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1886 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_000318__396 | 0 | 0.0 | 21.0202 | 0 | [372, 325] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000318__396.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1887 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_220138__485 | 0 | 0.0 | 11.7943 | 0 | [372, 161] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220138__485.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1888 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_234239__749 | 0 | 0.0 | 22.5905 | 0 | [361, 531] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231213_234239__749.json | 25.0 | missing | missing | missing | |
| 1889 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_000229__766 | 0 | 0.0 | 26.1696 | 0 | [369, 416] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_000229__766.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1890 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_000248__177 | 0 | 0.0 | 19.1194 | 0 | [369, 291] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_000248__177.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1891 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_220126__377 | 0 | 0.0 | 21.4502 | 0 | [369, 335] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_220126__377.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1892 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240131_221726__457 | 0 | 0.0 | 5.08468 | 0 | [0, 387] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221726__457.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1893 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_221731__309 | 1 | 0.0 | 4.89962 | 1 | [0, 373] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221731__309.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1894 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_221735__425 | 0 | 0.0 | 4.08635 | 0 | [0, 312] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221735__425.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1895 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_221740__389 | 0 | 0.0 | 4.4468 | 0 | [0, 339] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221740__389.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1896 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240131_221744__106 | 0 | 0.0 | 4.01451 | 0 | [0, 306] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221744__106.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1897 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_221657__211 | 1 | 0.0 | 1.12024 | 2 | [0, 83] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221657__211.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1898 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_221658__350 | 1 | 0.0 | 0.606657 | 2 | [0, 45] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221658__350.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1899 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_221658__534 | 1 | 0.0 | 0.477753 | 2 | [0, 36] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221658__534.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1900 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_221659__794 | 1 | 0.0 | 0.47369 | 2 | [0, 36] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221659__794.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1901 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240131_221701__767 | 0 | 0.0 | 1.94316 | 0 | [0, 148] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221701__767.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1902 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221639__903 | 0 | 0.0 | 0.624376 | 0 | [0, 47] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221639__903.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1903 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240131_221644__650 | 0 | 0.0 | 5.20953 | 0 | [0, 383] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221644__650.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1904 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221645__584 | 0 | 0.0 | 0.804675 | 0 | [0, 60] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221645__584.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1905 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_221648__909 | 1 | 0.0 | 3.01729 | 2 | [0, 225] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221648__909.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1906 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_221649__494 | 0 | 0.0 | 1.21626 | 0 | [0, 91] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221649__494.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1907 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240131_221856__970 | 0 | 0.0 | 5.33662 | 0 | [0, 396] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221856__970.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1908 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_221901__447 | 0 | 0.0 | 4.75279 | 0 | [0, 353] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221901__447.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1909 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_221905__396 | 0 | 0.0 | 4.16129 | 0 | [0, 310] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221905__396.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1910 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_221908__667 | 1 | 0.0 | 2.49843 | 2 | [0, 187] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221908__667.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1911 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_221913__263 | 0 | 0.0 | 5.1168 | 0 | [0, 379] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221913__263.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1912 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_221811__692 | 0 | 0.0 | 10.5936 | 2 | [0, 769] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_221811__692.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1913 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_221817__541 | 0 | 0.0 | 5.83045 | 0 | [0, 429] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_221817__541.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1914 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_221824__424 | 0 | 0.0 | 6.43567 | 0 | [0, 475] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_221824__424.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1915 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_221825__238 | 0 | 0.0 | 0.599722 | 0 | [0, 45] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_221825__238.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1916 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_221828__953 | 0 | 0.0 | 3.73117 | 0 | [0, 276] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_221828__953.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1917 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231213_234402__401 | 0 | 0.0 | 18.4332 | 0 | [44, 548] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__AsIs__1SHOT__20231213_234402__401.json | 0.0 | missing | missing | missing | |
| 1918 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_000452__897 | 0 | 0.0 | 8.46897 | 0 | [39, 153] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_000452__897.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1919 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_000459__369 | 0 | 0.0 | 7.10981 | 0 | [39, 127] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_000459__369.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1920 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231213_234344__984 | 0 | 0.0 | 13.6866 | 0 | [60, 412] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__InJulia__1SHOT__20231213_234344__984.json | 0.0 | missing | missing | missing | |
| 1921 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_000411__226 | 0 | 0.0 | 29.6418 | 0 | [42, 546] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_000411__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1922 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_000443__300 | 0 | 0.0 | 31.8764 | 0 | [42, 586] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_000443__300.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1923 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234330__817 | 0 | 0.0 | 15.908 | 0 | [90, 464] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231213_234330__817.json | 50.0 | missing | missing | missing | |
| 1924 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000336__692 | 0 | 0.0 | 4.16003 | 0 | [44, 70] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_000336__692.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1925 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000342__555 | 0 | 0.0 | 5.93173 | 0 | [44, 104] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_000342__555.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1926 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234314__207 | 0 | 0.0 | 17.442 | 0 | [201, 464] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234314__207.json | 50.0 | missing | missing | missing | |
| 1927 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000331__819 | 0 | 0.0 | 0.877273 | 0 | [94, 1] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000331__819.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1928 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000331__991 | 0 | 0.0 | 12.5777 | 0 | [94, 30] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000331__991.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1929 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234443__685 | 1 | 0.0 | 19.5067 | 1 | [11, 530] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234443__685.json | 67.5 | missing | missing | missing | |
| 1930 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_000512__617 | 0 | 0.0 | 2.31192 | 0 | [61, 34] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000512__617.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1931 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_000515__591 | 0 | 0.0 | 2.8428 | 0 | [61, 44] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000515__591.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1932 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_234424__549 | 0 | 0.0 | 21.3398 | 0 | [361, 500] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231213_234424__549.json | 50.0 | missing | missing | missing | |
| 1933 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_000503__706 | 0 | 0.0 | 4.32742 | 0 | [58, 73] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_000503__706.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1934 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_000510__526 | 0 | 0.0 | 6.56486 | 0 | [58, 116] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_000510__526.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1935 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_222417__692 | 0 | 0.0 | 8.86668 | 0 | [0, 318] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_222417__692.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1936 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_222423__300 | 0 | 0.0 | 5.82523 | 0 | [0, 209] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_222423__300.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1937 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_222430__520 | 0 | 0.0 | 7.67242 | 0 | [0, 275] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_222430__520.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1938 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_222440__942 | 0 | 0.0 | 9.91481 | 0 | [0, 355] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_222440__942.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1939 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_222445__110 | 0 | 0.0 | 4.9588 | 0 | [0, 178] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_222445__110.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1940 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_222246__379 | 0 | 0.0 | 5.99764 | 0 | [0, 215] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_222246__379.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1941 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_222250__810 | 0 | 0.0 | 4.23244 | 0 | [0, 152] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_222250__810.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1942 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_222301__480 | 0 | 0.0 | 10.4617 | 0 | [0, 374] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_222301__480.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1943 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_222311__509 | 0 | 0.0 | 10.6299 | 0 | [0, 380] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_222311__509.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1944 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_222318__280 | 0 | 0.0 | 6.80082 | 0 | [0, 243] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_222318__280.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1945 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240131_222151__280 | 0 | 0.0 | 6.94898 | 0 | [0, 249] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_222151__280.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1946 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_222159__320 | 0 | 0.0 | 8.54714 | 0 | [0, 306] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_222159__320.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1947 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_222205__709 | 0 | 0.0 | 6.1763 | 0 | [0, 221] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_222205__709.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1948 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240131_222208__655 | 0 | 0.0 | 2.41869 | 0 | [0, 87] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_222208__655.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1949 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240131_222214__486 | 0 | 0.0 | 6.4597 | 0 | [0, 231] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_222214__486.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1950 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_222543__860 | 0 | 0.0 | 0.116373 | 0 | [0, 4] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222543__860.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1951 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_222544__742 | 0 | 0.0 | 0.116615 | 0 | [0, 4] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222544__742.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1952 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_222553__423 | 0 | 0.0 | 9.59558 | 0 | [0, 338] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222553__423.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1953 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_222608__591 | 0 | 0.0 | 15.24 | 0 | [0, 539] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222608__591.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1954 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_222609__702 | 0 | 0.0 | 0.12261 | 0 | [0, 4] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222609__702.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1955 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_222527__142 | 1 | 0.0 | 5.71864 | 2 | [0, 203] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222527__142.json | 80.0 | missing | {"num_gpu": 99} | missing |
| 1956 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_222527__398 | 0 | 0.0 | 0.143803 | 0 | [0, 5] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222527__398.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1957 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_222527__411 | 0 | 0.0 | 0.143787 | 0 | [0, 5] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222527__411.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1958 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_222527__416 | 0 | 0.0 | 0.143908 | 0 | [0, 5] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222527__416.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1959 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_222527__526 | 0 | 0.0 | 0.14408 | 0 | [0, 5] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222527__526.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1960 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240131_221318__785 | 0 | 0.0 | 5.43122 | 0 | [0, 134] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_221318__785.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1961 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240131_221321__960 | 0 | 0.0 | 2.42292 | 0 | [0, 60] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_221321__960.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1962 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240131_221326__983 | 0 | 0.0 | 5.45811 | 0 | [0, 135] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_221326__983.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1963 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240131_221335__724 | 0 | 0.0 | 8.22364 | 0 | [0, 203] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_221335__724.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1964 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240131_221343__565 | 0 | 0.0 | 8.18607 | 0 | [0, 202] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_221343__565.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1965 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_221204__169 | 0 | 0.0 | 8.81212 | 0 | [0, 217] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_221204__169.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1966 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_221213__524 | 0 | 0.0 | 9.50409 | 0 | [0, 234] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_221213__524.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1967 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_221218__237 | 0 | 0.0 | 4.25757 | 0 | [0, 105] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_221218__237.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1968 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_221223__659 | 0 | 0.0 | 5.59417 | 2 | [0, 138] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_221223__659.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 1969 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_221237__126 | 0 | 0.0 | 13.4107 | 0 | [0, 330] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_221237__126.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1970 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221058__258 | 0 | 0.0 | 6.34817 | 0 | [0, 155] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221058__258.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1971 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_221103__200 | 0 | 0.0 | 5.10297 | 0 | [0, 125] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221103__200.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1972 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221111__893 | 0 | 0.0 | 8.43244 | 0 | [0, 206] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221111__893.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1973 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221116__127 | 0 | 0.0 | 5.2028 | 0 | [0, 128] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221116__127.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1974 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_221124__379 | 0 | 0.0 | 7.89394 | 2 | [0, 194] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221124__379.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 1975 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_221555__568 | 0 | 0.0 | 8.64047 | 0 | [0, 210] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221555__568.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1976 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_221600__220 | 0 | 0.0 | 5.72665 | 0 | [0, 140] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221600__220.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1977 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_221603__403 | 0 | 0.0 | 2.94469 | 0 | [0, 72] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221603__403.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1978 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_221613__504 | 0 | 0.0 | 9.78971 | 0 | [0, 238] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221613__504.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1979 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_221624__728 | 0 | 0.0 | 10.7898 | 0 | [0, 262] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_221624__728.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1980 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_221432__233 | 0 | 0.0 | 4.76884 | 0 | [0, 116] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_221432__233.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1981 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_221442__613 | 0 | 0.0 | 9.44674 | 0 | [0, 226] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_221442__613.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1982 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_221452__332 | 0 | 0.0 | 10.7218 | 0 | [0, 259] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_221452__332.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1983 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_221459__211 | 0 | 0.0 | 6.92723 | 0 | [0, 169] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_221459__211.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1984 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_221508__574 | 0 | 0.0 | 8.54045 | 0 | [0, 208] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_221508__574.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1985 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_220435__301 | 0 | 0.0 | 12.1854 | 0 | [0, 228] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_220435__301.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1986 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_220455__210 | 0 | 0.0 | 19.9322 | 0 | [0, 373] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_220455__210.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1987 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240131_220507__427 | 0 | 0.0 | 12.4091 | 0 | [0, 232] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_220507__427.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 1988 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_220522__104 | 0 | 0.0 | 14.8368 | 0 | [0, 278] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_220522__104.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1989 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240131_220532__844 | 0 | 0.0 | 10.3265 | 0 | [0, 194] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_220532__844.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 1990 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_220215__422 | 0 | 0.0 | 12.4364 | 0 | [0, 232] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_220215__422.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1991 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_220241__381 | 0 | 0.0 | 25.8339 | 0 | [0, 480] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_220241__381.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1992 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_220253__843 | 0 | 0.0 | 12.3402 | 0 | [0, 230] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_220253__843.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1993 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_220305__110 | 0 | 0.0 | 11.7786 | 0 | [0, 220] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_220305__110.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 1994 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_220316__896 | 0 | 0.0 | 10.3932 | 0 | [0, 194] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_220316__896.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1995 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_220032__723 | 0 | 0.0 | 12.2289 | 0 | [0, 227] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_220032__723.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1996 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_220046__103 | 0 | 0.0 | 13.3315 | 0 | [0, 247] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_220046__103.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1997 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_220105__789 | 0 | 0.0 | 19.178 | 0 | [0, 355] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_220105__789.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1998 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_220113__486 | 0 | 0.0 | 8.29015 | 0 | [0, 154] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_220113__486.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 1999 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240131_220120__477 | 0 | 0.0 | 6.91726 | 0 | [0, 129] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_220120__477.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 2000 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_220856__345 | 0 | 0.0 | 15.9855 | 0 | [0, 295] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_220856__345.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 2001 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_220913__440 | 0 | 0.0 | 16.564 | 0 | [0, 306] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_220913__440.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 2002 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_220925__816 | 0 | 0.0 | 12.2291 | 0 | [0, 226] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_220925__816.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2003 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_220935__112 | 0 | 0.0 | 9.38071 | 0 | [0, 174] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_220935__112.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2004 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_220947__619 | 0 | 0.0 | 12.1034 | 0 | [0, 224] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_220947__619.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2005 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_220654__331 | 1 | 0.0 | 17.0021 | 2 | [0, 315] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_220654__331.json | 80.0 | missing | {"num_gpu": 99} | missing |
| 2006 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_220705__174 | 0 | 0.0 | 10.7562 | 0 | [0, 199] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_220705__174.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2007 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_220718__805 | 1 | 0.0 | 13.2379 | 2 | [0, 245] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_220718__805.json | 80.0 | missing | {"num_gpu": 99} | missing |
| 2008 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_220723__594 | 0 | 0.0 | 5.2096 | 0 | [0, 96] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_220723__594.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2009 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_220732__757 | 0 | 0.0 | 8.65878 | 0 | [0, 161] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_220732__757.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2010 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_221939__860 | 0 | 0.0 | 1.41635 | 0 | [0, 170] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221939__860.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 2011 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240131_221942__489 | 0 | 0.0 | 2.38262 | 0 | [0, 291] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221942__489.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 2012 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_221944__329 | 0 | 0.0 | 2.00435 | 0 | [0, 246] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221944__329.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 2013 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_221945__482 | 0 | 0.0 | 1.28903 | 0 | [0, 158] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221945__482.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 2014 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_221947__237 | 0 | 0.0 | 2.25441 | 0 | [0, 274] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_221947__237.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 2015 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_221926__261 | 0 | 0.0 | 0.233045 | 0 | [0, 28] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221926__261.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2016 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_221926__596 | 0 | 0.0 | 0.232129 | 0 | [0, 28] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221926__596.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2017 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_221926__819 | 0 | 0.0 | 0.232043 | 0 | [0, 28] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221926__819.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2018 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_221927__177 | 0 | 0.0 | 0.233677 | 0 | [0, 28] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221927__177.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2019 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_221927__997 | 0 | 0.0 | 0.299381 | 0 | [0, 36] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_221927__997.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2020 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221921__123 | 0 | 0.0 | 0.722951 | 0 | [0, 87] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221921__123.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 2021 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221921__225 | 0 | 0.0 | 0.513852 | 0 | [0, 62] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221921__225.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2022 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221922__421 | 0 | 0.0 | 0.720889 | 0 | [0, 87] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221922__421.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2023 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221923__435 | 0 | 0.0 | 0.934896 | 0 | [0, 113] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221923__435.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2024 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_221924__533 | 0 | 0.0 | 0.662738 | 0 | [0, 80] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_221924__533.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2025 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_222038__842 | 0 | 0.0 | 2.74836 | 0 | [0, 314] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222038__842.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2026 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_222040__660 | 1 | 0.0 | 2.35745 | 1 | [0, 270] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222040__660.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2027 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_222042__112 | 0 | 0.0 | 1.64316 | 0 | [0, 189] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222042__112.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2028 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_222044__988 | 0 | 0.0 | 1.90412 | 0 | [0, 219] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222044__988.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2029 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_222048__172 | 1 | 0.0 | 4.03617 | 2 | [0, 459] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_222048__172.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2030 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_222003__904 | 1 | 0.0 | 1.78449 | 2 | [0, 209] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222003__904.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2031 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_222005__802 | 0 | 0.0 | 2.46676 | 0 | [0, 292] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222005__802.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2032 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_222008__305 | 0 | 0.0 | 2.57912 | 0 | [0, 307] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222008__305.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2033 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_222011__788 | 0 | 0.0 | 2.54568 | 1 | [0, 304] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222011__788.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2034 | NVIDIA-RTX-4090-4x | count_model_rows | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_222014__564 | 0 | 0.0 | 3.22972 | 0 | [0, 383] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_222014__564.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2035 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200214__347 | 1 | 0.000288 | 1.50312 | 2 | [54, 174] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200214__347.json | 80.0 | missing | missing | missing |
| 2036 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200215__530 | 1 | 0.0002175 | 1.08927 | 2 | [54, 127] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200215__530.json | 80.0 | missing | missing | missing |
| 2037 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200217__794 | 0 | 0.000306 | 1.35656 | 0 | [54, 186] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200217__794.json | 50.0 | missing | missing | missing |
| 2038 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200218__455 | 1 | 0.000276 | 1.48263 | 2 | [54, 166] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200218__455.json | 80.0 | missing | missing | missing |
| 2039 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200220__370 | 0 | 0.0002235 | 1.24175 | 0 | [54, 131] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200220__370.json | 50.0 | missing | missing | missing |
| 2040 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200210__699 | 1 | 9.55e-5 | 0.663382 | 2 | [89, 34] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200210__699.json | 80.0 | missing | missing | missing |
| 2041 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200211__485 | 1 | 9.7e-5 | 0.513976 | 2 | [89, 35] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200211__485.json | 80.0 | missing | missing | missing |
| 2042 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200212__412 | 1 | 9.85e-5 | 0.455396 | 2 | [89, 36] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200212__412.json | 80.0 | missing | missing | missing |
| 2043 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200212__818 | 1 | 9.25e-5 | 0.535998 | 2 | [89, 32] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200212__818.json | 80.0 | missing | missing | missing |
| 2044 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_200213__851 | 0 | 0.0001075 | 0.574309 | 0 | [89, 42] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200213__851.json | 25.0 | missing | missing | missing |
| 2045 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200206__874 | 5 | 0.0002035 | 0.702944 | 2 | [179, 76] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200206__874.json | 100.0 | missing | missing | missing |
| 2046 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200207__872 | 1 | 0.000196 | 0.919204 | 2 | [179, 71] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200207__872.json | 80.0 | missing | missing | missing |
| 2047 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200208__533 | 1 | 0.000196 | 0.883994 | 2 | [179, 71] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200208__533.json | 80.0 | missing | missing | missing |
| 2048 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200209__334 | 1 | 0.000193 | 0.893183 | 2 | [179, 69] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200209__334.json | 80.0 | missing | missing | missing |
| 2049 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200210__228 | 1 | 0.0001915 | 0.713106 | 2 | [179, 68] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200210__228.json | 80.0 | missing | missing | missing |
| 2050 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200225__343 | 1 | 0.0003405 | 1.20419 | 2 | [312, 123] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200225__343.json | 80.0 | missing | missing | missing |
| 2051 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200226__491 | 5 | 0.000225 | 0.659001 | 2 | [312, 46] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200226__491.json | 100.0 | missing | missing | missing |
| 2052 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200227__262 | 0 | 0.0002685 | 0.772867 | 0 | [312, 75] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200227__262.json | 0.0 | missing | missing | missing |
| 2053 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200228__200 | 1 | 0.0003735 | 1.33581 | 2 | [312, 145] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200228__200.json | 80.0 | missing | missing | missing |
| 2054 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200229__656 | 1 | 0.0002235 | 0.574172 | 2 | [312, 45] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200229__656.json | 80.0 | missing | missing | missing |
| 2055 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200221__790 | 1 | 0.00022 | 0.756991 | 2 | [311, 43] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200221__790.json | 80.0 | missing | missing | missing |
| 2056 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200221__808 | 1 | 0.00022 | 0.570766 | 2 | [311, 43] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200221__808.json | 80.0 | missing | missing | missing |
| 2057 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200222__843 | 1 | 0.000214 | 0.750536 | 2 | [311, 39] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200222__843.json | 80.0 | missing | missing | missing |
| 2058 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200223__135 | 1 | 0.000301 | 1.03729 | 2 | [311, 97] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200223__135.json | 80.0 | missing | missing | missing |
| 2059 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200224__595 | 1 | 0.000316 | 0.946077 | 2 | [311, 107] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200224__595.json | 80.0 | missing | missing | missing |
| 2060 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_072834__984 | 1 | 0.01716 | 40.0043 | 2 | [54, 554] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_072834__984.json | 80.0 | missing | missing | missing |
| 2061 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_072901__737 | 5 | 0.01398 | 26.4306 | 2 | [54, 448] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_072901__737.json | 100.0 | missing | missing | missing |
| 2062 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_072932__235 | 5 | 0.01377 | 31.143 | 2 | [54, 441] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_072932__235.json | 100.0 | missing | missing | missing |
| 2063 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_073001__600 | 5 | 0.01329 | 29.1029 | 2 | [54, 425] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_073001__600.json | 100.0 | missing | missing | missing |
| 2064 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_073030__488 | 5 | 0.01221 | 29.0031 | 2 | [54, 389] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_073030__488.json | 100.0 | missing | missing | missing |
| 2065 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_072451__229 | 5 | 0.00206 | 3.58215 | 2 | [89, 39] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_072451__229.json | 100.0 | missing | missing | missing |
| 2066 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_072454__944 | 5 | 0.00218 | 3.40121 | 2 | [89, 43] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_072454__944.json | 100.0 | missing | missing | missing |
| 2067 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_072459__758 | 5 | 0.00224 | 4.08267 | 2 | [89, 45] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_072459__758.json | 100.0 | missing | missing | missing |
| 2068 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_072502__882 | 5 | 0.00206 | 3.48979 | 2 | [89, 39] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_072502__882.json | 100.0 | missing | missing | missing |
| 2069 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_072506__557 | 5 | 0.00245 | 3.76887 | 2 | [89, 52] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_072506__557.json | 100.0 | missing | missing | missing |
| 2070 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_072323__110 | 5 | 0.00998 | 21.2141 | 2 | [179, 273] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_072323__110.json | 100.0 | missing | missing | missing |
| 2071 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_072332__871 | 5 | 0.00551 | 8.90089 | 2 | [179, 124] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_072332__871.json | 100.0 | missing | missing | missing |
| 2072 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_072345__649 | 5 | 0.00509 | 12.3627 | 2 | [179, 110] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_072345__649.json | 100.0 | missing | missing | missing |
| 2073 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_072406__501 | 5 | 0.01133 | 20.8332 | 2 | [179, 318] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_072406__501.json | 100.0 | missing | missing | missing |
| 2074 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_072431__562 | 5 | 0.01118 | 25.045 | 2 | [179, 313] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_072431__562.json | 100.0 | missing | missing | missing |
| 2075 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_074310__472 | 5 | 0.02226 | 52.7121 | 2 | [312, 638] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_074310__472.json | 100.0 | missing | missing | missing |
| 2076 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_074341__425 | 5 | 0.01479 | 30.5934 | 2 | [312, 389] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_074341__425.json | 100.0 | missing | missing | missing |
| 2077 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_074414__599 | 5 | 0.01644 | 33.2138 | 2 | [312, 444] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_074414__599.json | 100.0 | missing | missing | missing |
| 2078 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_074453__366 | 5 | 0.01833 | 39.2995 | 2 | [312, 507] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_074453__366.json | 100.0 | missing | missing | missing |
| 2079 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_074523__177 | 5 | 0.01605 | 29.4677 | 2 | [312, 431] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_074523__177.json | 100.0 | missing | missing | missing |
| 2080 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_073645__871 | 5 | 0.01925 | 44.1666 | 2 | [311, 538] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_073645__871.json | 100.0 | missing | missing | missing |
| 2081 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_073721__237 | 5 | 0.01796 | 35.6364 | 2 | [311, 495] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_073721__237.json | 100.0 | missing | missing | missing |
| 2082 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_073752__218 | 5 | 0.01229 | 30.9352 | 2 | [311, 306] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_073752__218.json | 100.0 | missing | missing | missing |
| 2083 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_073838__790 | 1 | 0.02219 | 46.3723 | 2 | [311, 636] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_073838__790.json | 80.0 | missing | missing | missing |
| 2084 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_073914__526 | 5 | 0.01892 | 35.3469 | 2 | [311, 527] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_073914__526.json | 100.0 | missing | missing | missing |
| 2085 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_214900__134 | 0 | 0.0 | 11.1217 | 0 | [44, 340] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_214900__134.json | 0.0 | missing | missing | missing | |
| 2086 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_214914__567 | 0 | 0.0 | 14.8807 | 0 | [1, 460] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_214914__567.json | 0.0 | missing | missing | missing | |
| 2087 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_214928__509 | 0 | 0.0 | 13.7121 | 0 | [1, 426] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_214928__509.json | 0.0 | missing | missing | missing | |
| 2088 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_002605__237 | 0 | 0.0 | 49.2977 | 0 | [57, 300] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_002605__237.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2089 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_002703__260 | 0 | 0.0 | 58.3766 | 0 | [57, 356] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_002703__260.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2090 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_214833__412 | 0 | 0.0 | 11.0637 | 0 | [1, 348] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_214833__412.json | 0.0 | missing | missing | missing | |
| 2091 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_214848__877 | 0 | 0.0 | 15.1402 | 0 | [1, 467] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_214848__877.json | 0.0 | missing | missing | missing | |
| 2092 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_002425__452 | 1 | 0.0 | 60.5936 | 2 | [60, 370] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_002425__452.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2093 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_002515__774 | 0 | 0.0 | 49.5821 | 0 | [60, 302] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_002515__774.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2094 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_221124__986 | 5 | 0.0 | 27.0139 | 2 | [60, 162] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_221124__986.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2095 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_214807__672 | 0 | 0.0 | 7.94208 | 0 | [1, 251] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_214807__672.json | 50.0 | missing | missing | missing | |
| 2096 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_214811__771 | 0 | 0.0 | 4.51061 | 0 | [1, 145] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_214811__771.json | 50.0 | missing | missing | missing | |
| 2097 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_002233__917 | 5 | 0.0 | 25.7016 | 2 | [101, 143] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_002233__917.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2098 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_002324__868 | 0 | 0.0 | 50.9779 | 0 | [101, 301] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_002324__868.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2099 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_221057__851 | 0 | 0.0 | 22.7213 | 0 | [101, 125] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_221057__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2100 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_214737__548 | 0 | 0.0 | 13.3937 | 0 | [1, 397] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_214737__548.json | 50.0 | missing | missing | missing | |
| 2101 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_214750__193 | 0 | 0.0 | 12.8179 | 0 | [1, 381] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_214750__193.json | 0.0 | missing | missing | missing | |
| 2102 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_002125__533 | 1 | 0.0 | 64.2227 | 2 | [213, 193] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_002125__533.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2103 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_002207__794 | 0 | 0.0 | 41.3104 | 0 | [213, 223] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_002207__794.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2104 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_221034__277 | 5 | 0.0 | 73.8019 | 2 | [213, 276] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221034__277.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2105 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_215104__861 | 0 | 0.0 | 14.61 | 0 | [1, 411] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215104__861.json | 50.0 | missing | missing | missing | |
| 2106 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_215117__608 | 0 | 0.0 | 12.8198 | 0 | [1, 363] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215117__608.json | 25.0 | missing | missing | missing | |
| 2107 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_003016__320 | 1 | 0.0 | 69.3058 | 2 | [389, 361] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003016__320.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2108 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_003052__750 | 0 | 0.0 | 35.7267 | 0 | [389, 159] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003052__750.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2109 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_221302__557 | 0 | 0.0 | 33.0222 | 0 | [389, 143] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221302__557.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2110 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_215014__120 | 0 | 0.0 | 21.3964 | 0 | [1, 586] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215014__120.json | 50.0 | missing | missing | missing | |
| 2111 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_215033__129 | 0 | 0.0 | 19.2519 | 0 | [1, 532] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215033__129.json | 0.0 | missing | missing | missing | |
| 2112 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_002808__553 | 5 | 0.0 | 64.2909 | 2 | [387, 331] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_002808__553.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2113 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_002906__811 | 5 | 0.0 | 57.9343 | 2 | [387, 293] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_002906__811.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2114 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_221229__169 | 5 | 0.0 | 64.4366 | 2 | [387, 333] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_221229__169.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2115 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231226_221918__154 | 0 | 0.0 | 9.5799 | 0 | [60, 373] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_221918__154.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2116 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_102959__396 | 0 | 0.0 | 9.22674 | 0 | [60, 357] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_102959__396.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2117 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_103007__185 | 0 | 0.0 | 7.45885 | 0 | [60, 288] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_103007__185.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2118 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_103016__930 | 0 | 0.0 | 8.73325 | 0 | [60, 339] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_103016__930.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2119 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_221908__410 | 0 | 0.0 | 7.92727 | 0 | [97, 301] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_221908__410.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2120 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_102939__892 | 0 | 0.0 | 6.06691 | 0 | [97, 228] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_102939__892.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2121 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_102945__552 | 0 | 0.0 | 6.22573 | 0 | [97, 233] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_102945__552.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2122 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_102950__993 | 0 | 0.0 | 4.74458 | 0 | [97, 174] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_102950__993.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2123 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_221900__726 | 0 | 0.0 | 8.98386 | 0 | [205, 196] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221900__726.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2124 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_102924__102 | 0 | 0.0 | 7.11626 | 0 | [205, 126] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_102924__102.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2125 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_102927__358 | 0 | 0.0 | 2.96017 | 0 | [205, 91] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_102927__358.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2126 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_102933__484 | 0 | 0.0 | 5.47301 | 0 | [205, 187] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_102933__484.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2127 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_221935__966 | 0 | 0.0 | 8.34377 | 0 | [349, 273] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221935__966.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2128 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_103050__519 | 0 | 0.0 | 8.17975 | 0 | [349, 261] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_103050__519.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2129 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_103103__757 | 0 | 0.0 | 12.0694 | 0 | [349, 406] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_103103__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2130 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_103112__612 | 0 | 0.0 | 9.86214 | 0 | [349, 326] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_103112__612.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2131 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_221927__714 | 0 | 0.0 | 8.80065 | 0 | [346, 290] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_221927__714.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2132 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_103027__938 | 0 | 0.0 | 11.6535 | 0 | [346, 389] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_103027__938.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2133 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_103033__230 | 0 | 0.0 | 6.06742 | 0 | [346, 188] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_103033__230.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2134 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_103042__592 | 1 | 0.0 | 8.8458 | 2 | [346, 289] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_103042__592.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2135 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_104805__551 | 0 | 0.0 | 2.35144 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104805__551.json | 25.0 | missing | missing | missing | |
| 2136 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_104808__393 | 0 | 0.0 | 2.94381 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104808__393.json | 50.0 | missing | missing | missing | |
| 2137 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_104817__701 | 0 | 0.0 | 1.90053 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104817__701.json | 25.0 | missing | missing | missing | |
| 2138 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_104820__969 | 0 | 0.0 | 2.91344 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_104820__969.json | 0.0 | missing | missing | missing | |
| 2139 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_113813__416 | 0 | 0.0 | 1.89602 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_113813__416.json | 50.0 | missing | missing | missing | |
| 2140 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_104733__578 | 0 | 0.0 | 5.55602 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104733__578.json | 50.0 | missing | missing | missing | |
| 2141 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_104737__758 | 0 | 0.0 | 3.46374 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104737__758.json | 0.0 | missing | missing | missing | |
| 2142 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_104739__776 | 0 | 0.0 | 2.3441 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104739__776.json | 50.0 | missing | missing | missing | |
| 2143 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_104744__283 | 0 | 0.0 | 4.67293 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104744__283.json | 50.0 | missing | missing | missing | |
| 2144 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_104747__473 | 0 | 0.0 | 2.8842 | 1 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_104747__473.json | 62.5 | missing | missing | missing | |
| 2145 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_104640__335 | 0 | 0.0 | 2.26715 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104640__335.json | 25.0 | missing | missing | missing | |
| 2146 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_104642__310 | 0 | 0.0 | 2.48906 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104642__310.json | 25.0 | missing | missing | missing | |
| 2147 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_104645__366 | 0 | 0.0 | 2.70059 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104645__366.json | 50.0 | missing | missing | missing | |
| 2148 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_104652__774 | 0 | 0.0 | 6.57323 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104652__774.json | 25.0 | missing | missing | missing | |
| 2149 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_104654__139 | 0 | 0.0 | 2.29829 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_104654__139.json | 25.0 | missing | missing | missing | |
| 2150 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_104944__947 | 0 | 0.0 | 6.5178 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104944__947.json | 50.0 | missing | missing | missing | |
| 2151 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_104946__721 | 0 | 0.0 | 2.68674 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104946__721.json | 25.0 | missing | missing | missing | |
| 2152 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_104949__871 | 0 | 0.0 | 2.13397 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104949__871.json | 50.0 | missing | missing | missing | |
| 2153 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_104951__146 | 0 | 0.0 | 2.45356 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104951__146.json | 50.0 | missing | missing | missing | |
| 2154 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_104954__275 | 0 | 0.0 | 3.30772 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_104954__275.json | 0.0 | missing | missing | missing | |
| 2155 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_104844__883 | 0 | 0.0 | 1.97379 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104844__883.json | 0.0 | missing | missing | missing | |
| 2156 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_104850__360 | 0 | 0.0 | 5.20158 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104850__360.json | 50.0 | missing | missing | missing | |
| 2157 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_104853__999 | 0 | 0.0 | 3.40411 | 1 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104853__999.json | 62.5 | missing | missing | missing | |
| 2158 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_104903__514 | 0 | 0.0 | 9.57522 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104903__514.json | 0.0 | missing | missing | missing | |
| 2159 | Apple-MacBook-Pro-M1 | count_model_rows | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_104907__212 | 0 | 0.0 | 4.24761 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_104907__212.json | 0.0 | missing | missing | missing | |
| 2160 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_215122__207 | 0 | 0.0 | 9.08964 | 0 | [0, 141] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_215122__207.json | 25.0 | missing | missing | missing | |
| 2161 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_215141__707 | 0 | 0.0 | 19.1426 | 0 | [0, 300] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_215141__707.json | 25.0 | missing | missing | missing | |
| 2162 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_215155__258 | 0 | 0.0 | 14.0623 | 0 | [0, 217] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_215155__258.json | 25.0 | missing | missing | missing | |
| 2163 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_215210__459 | 0 | 0.0 | 15.0667 | 0 | [0, 236] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_215210__459.json | 25.0 | missing | missing | missing | |
| 2164 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_215227__713 | 0 | 0.0 | 16.4914 | 0 | [0, 259] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_215227__713.json | 25.0 | missing | missing | missing | |
| 2165 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_214940__397 | 0 | 0.0 | 2.06947 | 0 | [0, 30] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_214940__397.json | 0.0 | missing | missing | missing | |
| 2166 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_214942__733 | 0 | 0.0 | 1.99825 | 0 | [0, 30] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_214942__733.json | 0.0 | missing | missing | missing | |
| 2167 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_214944__268 | 0 | 0.0 | 1.93201 | 0 | [0, 30] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_214944__268.json | 0.0 | missing | missing | missing | |
| 2168 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_214946__117 | 0 | 0.0 | 1.95106 | 0 | [0, 30] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_214946__117.json | 0.0 | missing | missing | missing | |
| 2169 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_214948__738 | 0 | 0.0 | 1.95469 | 0 | [0, 30] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_214948__738.json | 0.0 | missing | missing | missing | |
| 2170 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_214801__701 | 0 | 0.0 | 19.607 | 0 | [0, 305] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_214801__701.json | 0.0 | missing | missing | missing | |
| 2171 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_214824__285 | 0 | 0.0 | 22.0498 | 0 | [0, 345] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_214824__285.json | 0.0 | missing | missing | missing | |
| 2172 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_214850__422 | 0 | 0.0 | 26.8613 | 0 | [0, 416] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_214850__422.json | 0.0 | missing | missing | missing | |
| 2173 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_214908__251 | 0 | 0.0 | 17.4164 | 0 | [0, 271] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_214908__251.json | 25.0 | missing | missing | missing | |
| 2174 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_214928__801 | 0 | 0.0 | 19.9047 | 0 | [0, 311] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_214928__801.json | 25.0 | missing | missing | missing | |
| 2175 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_215748__595 | 0 | 0.0 | 19.5615 | 0 | [0, 299] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_215748__595.json | 50.0 | missing | missing | missing | |
| 2176 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_215818__345 | 0 | 0.0 | 29.4452 | 0 | [0, 448] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_215818__345.json | 50.0 | missing | missing | missing | |
| 2177 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240223_215841__905 | 0 | 0.0 | 23.1983 | 0 | [0, 358] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_215841__905.json | 25.0 | missing | missing | missing | |
| 2178 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240223_215905__623 | 0 | 0.0 | 24.5752 | 0 | [0, 371] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_215905__623.json | 0.0 | missing | missing | missing | |
| 2179 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240223_215925__992 | 0 | 0.0 | 19.5648 | 0 | [0, 297] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_215925__992.json | 0.0 | missing | missing | missing | |
| 2180 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_215432__348 | 0 | 0.0 | 20.629 | 0 | [0, 318] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_215432__348.json | 0.0 | missing | missing | missing | |
| 2181 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20240223_215456__413 | 0 | 0.0 | 24.2797 | 0 | [0, 375] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_215456__413.json | 25.0 | missing | missing | missing | |
| 2182 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_215518__612 | 0 | 0.0 | 22.1601 | 1 | [0, 344] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_215518__612.json | 62.5 | missing | missing | missing | |
| 2183 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_215539__735 | 0 | 0.0 | 20.9359 | 0 | [0, 322] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_215539__735.json | 50.0 | missing | missing | missing | |
| 2184 | Apple-MacBook-Pro-M1 | count_model_rows | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20240223_215555__264 | 0 | 0.0 | 16.1686 | 0 | [0, 250] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_215555__264.json | 25.0 | missing | missing | missing | |
| 2185 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231213_212857__164 | 0 | 0.000216 | 3.37219 | 0 | [51, 127] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_212857__164.json | 0.0 | missing | missing | missing | |
| 2186 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_184003__705 | 0 | 0.000288 | 3.39004 | 0 | [51, 175] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_184003__705.json | 0.0 | missing | missing | missing | |
| 2187 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_184005__365 | 0 | 0.0001725 | 2.15569 | 0 | [51, 98] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_184005__365.json | 0.0 | missing | missing | missing | |
| 2188 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo--optim | AsIs | 1SHOT | false | false | 5 | 20231215_190622__369 | 0 | 0.0 | 4.5524 | 0 | [51, 175] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_190622__369.json | 0.0 | 0.5 | missing | 0.5 | |
| 2189 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_212854__632 | 0 | 0.000411 | 5.41037 | 0 | [54, 256] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_212854__632.json | 50.0 | missing | missing | missing | |
| 2190 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_183955__152 | 0 | 0.0003825 | 4.00631 | 0 | [54, 237] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_183955__152.json | 50.0 | missing | missing | missing | |
| 2191 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_184000__658 | 0 | 0.00039 | 4.72808 | 0 | [54, 242] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_184000__658.json | 50.0 | missing | missing | missing | |
| 2192 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_191244__586 | 0 | 0.000456 | 5.23862 | 2 | [54, 286] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_191244__586.json | 75.0 | missing | missing | missing | |
| 2193 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_191246__741 | 0 | 0.000183 | 2.20028 | 0 | [54, 104] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_191246__741.json | 50.0 | missing | missing | missing | |
| 2194 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_190618__651 | 5 | 0.0 | 4.90573 | 2 | [54, 195] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_190618__651.json | 100.0 | 0.5 | missing | 0.5 | |
| 2195 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_212848__157 | 1 | 0.0001015 | 1.16132 | 2 | [89, 38] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_212848__157.json | 80.0 | missing | missing | missing | |
| 2196 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_183950__205 | 0 | 9.55e-5 | 1.13909 | 0 | [89, 34] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_183950__205.json | 50.0 | missing | missing | missing | |
| 2197 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_183951__187 | 0 | 9.25e-5 | 0.829683 | 0 | [89, 32] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_183951__187.json | 50.0 | missing | missing | missing | |
| 2198 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191237__925 | 1 | 9.85e-5 | 1.27729 | 2 | [89, 36] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_191237__925.json | 80.0 | missing | missing | missing | |
| 2199 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191239__101 | 0 | 9.55e-5 | 1.52188 | 0 | [89, 34] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_191239__101.json | 50.0 | missing | missing | missing | |
| 2200 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_190613__981 | 0 | 0.0 | 1.45723 | 0 | [89, 40] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_190613__981.json | 50.0 | 0.5 | missing | 0.5 | |
| 2201 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_212847__696 | 0 | 0.0005095 | 6.22961 | 0 | [179, 280] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_212847__696.json | 50.0 | missing | missing | missing | |
| 2202 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_183944__993 | 0 | 0.0004435 | 3.79363 | 0 | [179, 236] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183944__993.json | 50.0 | missing | missing | missing | |
| 2203 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_183949__239 | 0 | 0.0003625 | 3.72747 | 0 | [179, 182] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_183949__239.json | 50.0 | missing | missing | missing | |
| 2204 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_191232__539 | 0 | 0.0002125 | 2.21807 | 0 | [179, 82] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191232__539.json | 0.0 | missing | missing | missing | |
| 2205 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_191236__579 | 0 | 0.0003475 | 3.53578 | 0 | [179, 172] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191236__579.json | 25.0 | missing | missing | missing | |
| 2206 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_190611__456 | 0 | 0.0 | 2.46023 | 0 | [179, 70] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_190611__456.json | 0.0 | 0.5 | missing | 0.5 | |
| 2207 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_212907__111 | 0 | 0.0004875 | 4.37759 | 0 | [312, 221] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_212907__111.json | 50.0 | missing | missing | missing | |
| 2208 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_184018__734 | 1 | 0.000573 | 4.72011 | 2 | [312, 278] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184018__734.json | 80.0 | missing | missing | missing | |
| 2209 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_184024__900 | 0 | 0.000696 | 5.57646 | 0 | [312, 360] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184024__900.json | 50.0 | missing | missing | missing | |
| 2210 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_191303__752 | 1 | 0.0006435 | 5.44536 | 2 | [312, 325] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191303__752.json | 80.0 | missing | missing | missing | |
| 2211 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_191310__747 | 0 | 0.000645 | 6.0915 | 0 | [312, 326] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191310__747.json | 50.0 | missing | missing | missing | |
| 2212 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_190638__902 | 1 | 0.0 | 7.9809 | 2 | [312, 331] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_190638__902.json | 80.0 | 0.5 | missing | 0.5 |
| 2213 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_212902__795 | 0 | 0.0005425 | 4.85315 | 0 | [311, 258] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_212902__795.json | 50.0 | missing | missing | missing | |
| 2214 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184008__659 | 0 | 0.0003805 | 2.78316 | 0 | [311, 150] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_184008__659.json | 50.0 | missing | missing | missing | |
| 2215 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184013__399 | 0 | 0.0005665 | 4.8462 | 0 | [311, 274] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_184013__399.json | 50.0 | missing | missing | missing | |
| 2216 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191252__235 | 0 | 0.000619 | 6.02395 | 0 | [311, 309] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_191252__235.json | 50.0 | missing | missing | missing | |
| 2217 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191258__326 | 0 | 0.000631 | 5.46853 | 0 | [311, 317] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_191258__326.json | 50.0 | missing | missing | missing | |
| 2218 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_190630__908 | 5 | 0.0 | 7.62478 | 2 | [311, 331] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_190630__908.json | 100.0 | 0.5 | missing | 0.5 |
| 2219 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231213_212917__691 | 0 | 0.000449 | 3.59438 | 0 | [51, 199] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_212917__691.json | 0.0 | missing | missing | missing | |
| 2220 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_184035__195 | 0 | 0.000349 | 2.11582 | 0 | [51, 149] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_184035__195.json | 0.0 | missing | missing | missing | |
| 2221 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_184037__819 | 0 | 0.000361 | 2.04548 | 0 | [51, 155] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_184037__819.json | 0.0 | missing | missing | missing | |
| 2222 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 5 | 20231215_190650__414 | 0 | 0.0 | 3.71112 | 0 | [51, 137] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_190650__414.json | 0.0 | 0.9 | missing | 0.1 |
| 2223 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | false | 5 | 20231213_212914__142 | 0 | 0.000474 | 3.71138 | 0 | [54, 210] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_212914__142.json | 25.0 | missing | missing | missing | |
| 2224 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_184031__747 | 1 | 0.000264 | 1.60518 | 2 | [54, 105] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_184031__747.json | 80.0 | missing | missing | missing | |
| 2225 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_184033__881 | 0 | 0.000408 | 2.23107 | 0 | [54, 177] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_184033__881.json | 50.0 | missing | missing | missing | |
| 2226 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_191317__930 | 1 | 0.0003 | 2.157 | 2 | [54, 123] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_191317__930.json | 80.0 | missing | missing | missing | |
| 2227 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | false | 5 | 20231227_191320__295 | 0 | 0.00044 | 3.28609 | 0 | [54, 193] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_191320__295.json | 25.0 | missing | missing | missing | |
| 2228 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_190646__135 | 1 | 0.0 | 4.19871 | 2 | [54, 174] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_190646__135.json | 80.0 | 0.9 | missing | 0.1 |
| 2229 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_212910__395 | 1 | 0.000157 | 0.798098 | 2 | [89, 34] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_212910__395.json | 80.0 | missing | missing | missing | |
| 2230 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184028__234 | 1 | 0.000161 | 0.900967 | 2 | [89, 36] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_184028__234.json | 80.0 | missing | missing | missing | |
| 2231 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184029__262 | 1 | 0.000157 | 1.23721 | 2 | [89, 34] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_184029__262.json | 80.0 | missing | missing | missing | |
| 2232 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191314__672 | 1 | 0.000157 | 1.27145 | 2 | [89, 34] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_191314__672.json | 80.0 | missing | missing | missing | |
| 2233 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191315__305 | 5 | 0.000163 | 1.03561 | 2 | [89, 37] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_191315__305.json | 100.0 | missing | missing | missing | |
| 2234 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_190642__438 | 1 | 0.0 | 1.88337 | 2 | [89, 34] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_190642__438.json | 80.0 | 0.9 | missing | 0.1 |
| 2235 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_212909__220 | 5 | 0.000311 | 2.28184 | 2 | [179, 66] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_212909__220.json | 100.0 | missing | missing | missing | |
| 2236 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_184025__683 | 0 | 0.000319 | 1.24153 | 0 | [179, 70] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184025__683.json | 0.0 | missing | missing | missing | |
| 2237 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184027__176 | 1 | 0.000337 | 1.49618 | 2 | [179, 79] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184027__176.json | 80.0 | missing | missing | missing | |
| 2238 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_191311__187 | 0 | 0.000317 | 1.778 | 0 | [179, 69] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191311__187.json | 25.0 | missing | missing | missing | |
| 2239 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_191312__959 | 1 | 0.000249 | 0.988293 | 2 | [179, 35] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191312__959.json | 80.0 | missing | missing | missing | |
| 2240 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_190640__415 | 1 | 0.0 | 1.35808 | 2 | [179, 63] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_190640__415.json | 80.0 | 0.9 | missing | 0.1 |
| 2241 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_212921__202 | 0 | 0.000534 | 2.1084 | 0 | [312, 111] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_212921__202.json | 0.0 | missing | missing | missing | |
| 2242 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_184041__637 | 0 | 0.000552 | 1.66926 | 0 | [312, 120] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184041__637.json | 0.0 | missing | missing | missing | |
| 2243 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_184042__276 | 0 | 0.000434 | 1.05497 | 0 | [312, 61] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184042__276.json | 0.0 | missing | missing | missing | |
| 2244 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_191326__662 | 0 | 0.000494 | 2.23592 | 0 | [312, 91] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191326__662.json | 0.0 | missing | missing | missing | |
| 2245 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_191328__139 | 0 | 0.000442 | 1.91437 | 0 | [312, 65] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191328__139.json | 0.0 | missing | missing | missing | |
| 2246 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_190658__612 | 1 | 0.0 | 6.69083 | 2 | [312, 177] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_190658__612.json | 80.0 | 0.9 | missing | 0.1 |
| 2247 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_212919__169 | 1 | 0.000397 | 1.33596 | 2 | [311, 43] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_212919__169.json | 80.0 | missing | missing | missing | |
| 2248 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184038__613 | 5 | 0.000403 | 1.06251 | 2 | [311, 46] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_184038__613.json | 100.0 | missing | missing | missing | |
| 2249 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184039__151 | 1 | 0.000409 | 1.02854 | 2 | [311, 49] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_184039__151.json | 80.0 | missing | missing | missing | |
| 2250 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191322__384 | 5 | 0.000501 | 1.81289 | 2 | [311, 95] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_191322__384.json | 100.0 | missing | missing | missing | |
| 2251 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191323__571 | 1 | 0.000389 | 1.03813 | 2 | [311, 39] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_191323__571.json | 80.0 | missing | missing | missing | |
| 2252 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_190651__922 | 1 | 0.0 | 1.39715 | 2 | [311, 43] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_190651__922.json | 80.0 | 0.9 | missing | 0.1 |
| 2253 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231213_213101__895 | 0 | 0.01575 | 41.9263 | 0 | [51, 508] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_213101__895.json | 0.0 | missing | missing | missing | |
| 2254 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_184220__346 | 0 | 0.00957 | 22.8122 | 0 | [51, 302] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_184220__346.json | 0.0 | missing | missing | missing | |
| 2255 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_184232__349 | 0 | 0.0093 | 11.5727 | 0 | [51, 293] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_184232__349.json | 0.0 | missing | missing | missing | |
| 2256 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 5 | 20231215_190838__257 | 0 | 0.0 | 30.66 | 0 | [51, 241] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_190838__257.json | 0.0 | 0.1 | missing | 0.9 |
| 2257 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_213019__826 | 5 | 0.01212 | 23.4811 | 2 | [54, 386] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_213019__826.json | 100.0 | missing | missing | missing | |
| 2258 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_184140__687 | 5 | 0.01161 | 20.7062 | 2 | [54, 369] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_184140__687.json | 100.0 | missing | missing | missing | |
| 2259 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_184157__589 | 5 | 0.01233 | 16.4134 | 2 | [54, 393] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_184157__589.json | 100.0 | missing | missing | missing | |
| 2260 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_191440__887 | 5 | 0.01167 | 21.7105 | 2 | [54, 371] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_191440__887.json | 100.0 | missing | missing | missing | |
| 2261 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_191512__967 | 5 | 0.012 | 32.3911 | 2 | [54, 382] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_191512__967.json | 100.0 | missing | missing | missing | |
| 2262 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_190807__165 | 5 | 0.0 | 35.2113 | 2 | [54, 444] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_190807__165.json | 100.0 | 0.1 | missing | 0.9 |
| 2263 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_212956__262 | 5 | 0.00428 | 9.23454 | 2 | [89, 113] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_212956__262.json | 100.0 | missing | missing | missing | |
| 2264 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184110__349 | 5 | 0.00425 | 7.30659 | 2 | [89, 112] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_184110__349.json | 100.0 | missing | missing | missing | |
| 2265 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184119__948 | 5 | 0.00614 | 9.85492 | 2 | [89, 175] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_184119__948.json | 100.0 | missing | missing | missing | |
| 2266 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191411__738 | 5 | 0.00704 | 11.695 | 2 | [89, 205] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_191411__738.json | 100.0 | missing | missing | missing | |
| 2267 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191418__453 | 5 | 0.00425 | 6.26356 | 2 | [89, 112] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_191418__453.json | 100.0 | missing | missing | missing | |
| 2268 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_190732__504 | 5 | 0.0 | 8.42907 | 2 | [89, 132] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_190732__504.json | 100.0 | 0.1 | missing | 0.9 |
| 2269 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_212946__509 | 5 | 0.00923 | 25.7086 | 2 | [179, 248] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_212946__509.json | 100.0 | missing | missing | missing | |
| 2270 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184051__656 | 5 | 0.00698 | 8.89759 | 2 | [179, 173] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184051__656.json | 100.0 | missing | missing | missing | |
| 2271 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184102__948 | 5 | 0.00746 | 10.9655 | 2 | [179, 189] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184102__948.json | 100.0 | missing | missing | missing | |
| 2272 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_191348__257 | 5 | 0.00893 | 20.2604 | 2 | [179, 238] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191348__257.json | 100.0 | missing | missing | missing | |
| 2273 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_191400__154 | 5 | 0.00887 | 11.706 | 2 | [179, 236] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191400__154.json | 100.0 | missing | missing | missing | |
| 2274 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_190723__467 | 5 | 0.0 | 25.6357 | 2 | [179, 262] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_190723__467.json | 100.0 | 0.1 | missing | 0.9 |
| 2275 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_213147__205 | 5 | 0.01287 | 21.8327 | 2 | [312, 325] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_213147__205.json | 100.0 | missing | missing | missing | |
| 2276 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_184314__753 | 5 | 0.01005 | 8.42692 | 2 | [312, 231] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184314__753.json | 100.0 | missing | missing | missing | |
| 2277 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_184327__105 | 5 | 0.01134 | 13.2386 | 2 | [312, 274] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184327__105.json | 100.0 | missing | missing | missing | |
| 2278 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_191653__416 | 5 | 0.01029 | 17.5708 | 2 | [312, 239] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191653__416.json | 100.0 | missing | missing | missing | |
| 2279 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_191723__452 | 5 | 0.00873 | 30.3115 | 2 | [312, 187] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191723__452.json | 100.0 | missing | missing | missing | |
| 2280 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_190922__418 | 5 | 0.0 | 28.9556 | 2 | [312, 308] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_190922__418.json | 100.0 | 0.1 | missing | 0.9 | |
| 2281 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_213125__455 | 1 | 0.01115 | 23.7671 | 2 | [311, 268] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_213125__455.json | 80.0 | missing | missing | missing | |
| 2282 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184252__809 | 5 | 0.01382 | 20.3968 | 2 | [311, 357] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_184252__809.json | 100.0 | missing | missing | missing | |
| 2283 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184305__705 | 5 | 0.01139 | 13.2165 | 2 | [311, 276] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_184305__705.json | 100.0 | missing | missing | missing | |
| 2284 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191529__836 | 1 | 0.01301 | 16.8607 | 2 | [311, 330] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_191529__836.json | 80.0 | missing | missing | missing | |
| 2285 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191635__924 | 5 | 0.01769 | 65.7321 | 2 | [311, 486] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_191635__924.json | 100.0 | missing | missing | missing | |
| 2286 | Apple-MacBook-Pro-M1 | count_model_rows | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_190853__915 | 5 | 0.0 | 15.0709 | 2 | [311, 224] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_190853__915.json | 100.0 | 0.1 | missing | 0.9 | |
| 2287 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | AsIs | 1SHOT | false | false | 5 | 20231213_233631__570 | 0 | 0.0 | 13.2922 | 0 | [44, 403] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__AsIs__1SHOT__20231213_233631__570.json | 0.0 | missing | missing | missing | |
| 2288 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | AsIs | 1SHOT | false | false | 5 | 20231224_234524__390 | 0 | 0.0 | 17.8215 | 0 | [44, 531] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__AsIs__1SHOT__20231224_234524__390.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2289 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | AsIs | 1SHOT | false | false | 5 | 20231224_234535__800 | 0 | 0.0 | 11.2827 | 0 | [1, 354] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__AsIs__1SHOT__20231224_234535__800.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2290 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | InJulia | 1SHOT | false | false | 5 | 20231213_233618__248 | 0 | 0.0 | 13.2318 | 0 | [60, 399] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__InJulia__1SHOT__20231213_233618__248.json | 0.0 | missing | missing | missing | |
| 2291 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | InJulia | 1SHOT | false | false | 5 | 20231224_234455__228 | 0 | 0.0 | 13.1699 | 0 | [60, 398] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__InJulia__1SHOT__20231224_234455__228.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2292 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | InJulia | 1SHOT | false | false | 5 | 20231224_234506__267 | 0 | 0.0 | 10.9498 | 0 | [1, 344] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__InJulia__1SHOT__20231224_234506__267.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2293 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | InJulia | 1SHOT | true | false | 5 | 20231226_215449__134 | 0 | 0.0 | 8.04132 | 0 | [60, 248] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__InJulia__1SHOT__20231226_215449__134.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2294 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_233605__930 | 0 | 0.0 | 7.85529 | 0 | [90, 229] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertAsk__1SHOT__20231213_233605__930.json | 50.0 | missing | missing | missing | |
| 2295 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234436__407 | 0 | 0.0 | 7.82815 | 0 | [90, 229] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_234436__407.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2296 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234442__153 | 0 | 0.0 | 5.74822 | 0 | [1, 183] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_234442__153.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2297 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_215441__596 | 0 | 0.0 | 9.61107 | 0 | [90, 286] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_215441__596.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2298 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_233557__202 | 0 | 0.0 | 16.5371 | 0 | [201, 438] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233557__202.json | 50.0 | missing | missing | missing | |
| 2299 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_234411__984 | 0 | 0.0 | 21.9894 | 2 | [219, 446] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234411__984.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2300 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_234428__372 | 0 | 0.0 | 16.8116 | 0 | [1, 489] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234428__372.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2301 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_215431__783 | 0 | 0.0 | 15.2671 | 0 | [219, 270] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215431__783.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2302 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_233715__967 | 0 | 0.0 | 23.961 | 0 | [11, 642] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233715__967.json | 0.0 | missing | missing | missing | |
| 2303 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_234629__220 | 0 | 0.0 | 19.5344 | 0 | [11, 532] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234629__220.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2304 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234645__778 | 0 | 0.0 | 15.9547 | 0 | [1, 445] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234645__778.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2305 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_215527__136 | 0 | 0.0 | 15.9301 | 0 | [11, 445] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_215527__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2306 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_233651__679 | 0 | 0.0 | 19.898 | 0 | [361, 462] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapTask__1SHOT__20231213_233651__679.json | 25.0 | missing | missing | missing | |
| 2307 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_234555__380 | 0 | 0.0 | 19.3989 | 0 | [361, 450] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_234555__380.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2308 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_234610__411 | 0 | 0.0 | 15.2164 | 0 | [1, 426] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_234610__411.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2309 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_215511__173 | 0 | 0.0 | 21.8057 | 2 | [361, 520] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_215511__173.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2310 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | AsIs | 1SHOT | false | false | 5 | 20231213_234541__526 | 0 | 0.0 | 13.6453 | 0 | [44, 413] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__AsIs__1SHOT__20231213_234541__526.json | 0.0 | missing | missing | missing | |
| 2311 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_000610__627 | 0 | 0.0 | 4.21077 | 0 | [58, 135] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__AsIs__1SHOT__20231225_000610__627.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2312 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_000615__344 | 0 | 0.0 | 4.30867 | 0 | [58, 138] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__AsIs__1SHOT__20231225_000615__344.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2313 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | InJulia | 1SHOT | true | false | 5 | 20231213_234528__334 | 0 | 0.0 | 22.1687 | 0 | [60, 642] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__InJulia__1SHOT__20231213_234528__334.json | 25.0 | missing | missing | missing | |
| 2314 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_000558__910 | 0 | 0.0 | 8.12041 | 0 | [60, 267] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__InJulia__1SHOT__20231225_000558__910.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2315 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | InJulia | 1SHOT | true | false | 5 | 20231225_000606__669 | 0 | 0.0 | 8.19033 | 0 | [60, 270] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__InJulia__1SHOT__20231225_000606__669.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2316 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | InJulia | 1SHOT | true | true | 5 | 20231226_220208__595 | 0 | 0.0 | 5.94434 | 0 | [60, 194] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__InJulia__1SHOT__20231226_220208__595.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2317 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234505__551 | 0 | 0.0 | 6.46371 | 0 | [90, 186] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231213_234505__551.json | 50.0 | missing | missing | missing | |
| 2318 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_000541__141 | 1 | 0.0 | 6.13097 | 2 | [100, 191] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_000541__141.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2319 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000550__445 | 0 | 0.0 | 8.80394 | 0 | [100, 281] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_000550__445.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2320 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_220202__922 | 0 | 0.0 | 8.75688 | 0 | [100, 280] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_220202__922.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2321 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_234459__575 | 0 | 0.0 | 15.2193 | 0 | [201, 403] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234459__575.json | 0.0 | missing | missing | missing | |
| 2322 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000528__451 | 0 | 0.0 | 12.983 | 0 | [211, 204] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000528__451.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2323 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_000534__850 | 0 | 0.0 | 6.62346 | 0 | [211, 191] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000534__850.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2324 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_220153__362 | 0 | 0.0 | 14.932 | 0 | [211, 276] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220153__362.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2325 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234627__972 | 0 | 0.0 | 24.8557 | 0 | [11, 665] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234627__972.json | 50.0 | missing | missing | missing | |
| 2326 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_000642__879 | 0 | 0.0 | 10.3014 | 0 | [364, 283] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000642__879.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2327 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_000649__625 | 0 | 0.0 | 6.71047 | 0 | [364, 167] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000649__625.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2328 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_220223__537 | 5 | 0.0 | 8.25833 | 2 | [364, 218] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220223__537.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2329 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_234602__937 | 0 | 0.0 | 20.9137 | 0 | [361, 489] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapTask__1SHOT__20231213_234602__937.json | 0.0 | missing | missing | missing | |
| 2330 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_000622__251 | 0 | 0.0 | 7.673 | 0 | [361, 198] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_000622__251.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2331 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_000631__385 | 0 | 0.0 | 8.94708 | 0 | [361, 240] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_000631__385.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2332 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_220215__781 | 5 | 0.0 | 6.85407 | 2 | [361, 173] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_220215__781.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2333 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175048__165 | 0 | 0.0 | 8.9984 | 0 | [60, 174] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175048__165.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2334 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175057__429 | 5 | 0.0 | 8.3959 | 2 | [60, 162] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175057__429.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2335 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175116__348 | 0 | 0.0 | 19.3707 | 0 | [60, 380] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175116__348.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2336 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175014__300 | 0 | 0.0 | 9.92116 | 0 | [100, 186] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175014__300.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2337 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175029__470 | 0 | 0.0 | 14.7704 | 0 | [100, 282] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175029__470.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2338 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175039__302 | 0 | 0.0 | 10.4279 | 0 | [100, 196] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175039__302.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2339 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_174934__857 | 0 | 0.0 | 22.0824 | 0 | [211, 393] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174934__857.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2340 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_174946__637 | 0 | 0.0 | 12.2047 | 0 | [211, 191] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174946__637.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2341 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_175004__813 | 0 | 0.0 | 17.5841 | 0 | [211, 324] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175004__813.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2342 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175215__667 | 0 | 0.0 | 16.5031 | 0 | [364, 283] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175215__667.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2343 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175230__935 | 0 | 0.0 | 15.201 | 0 | [364, 258] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175230__935.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2344 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_175245__183 | 0 | 0.0 | 14.4849 | 0 | [364, 244] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175245__183.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2345 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_175132__706 | 0 | 0.0 | 15.204 | 0 | [361, 258] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175132__706.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2346 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175143__130 | 1 | 0.0 | 11.2641 | 2 | [361, 182] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175143__130.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2347 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175158__941 | 0 | 0.0 | 15.0039 | 0 | [361, 254] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175158__941.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2348 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231213_213403__207 | 0 | 0.00198763 | 20.3776 | 0 | [56, 227] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__AsIs__1SHOT__20231213_213403__207.json | 0.0 | missing | missing | missing | |
| 2349 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_184658__220 | 0 | 0.00372698 | 12.7359 | 0 | [56, 442] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__AsIs__1SHOT__20231225_184658__220.json | 0.0 | missing | missing | missing | |
| 2350 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_184717__996 | 0 | 0.00350046 | 18.4116 | 0 | [56, 414] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__AsIs__1SHOT__20231225_184717__996.json | 0.0 | missing | missing | missing | |
| 2351 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium--optim | AsIs | 1SHOT | false | false | 5 | 20231215_191211__826 | 0 | 0.0 | 45.6677 | 0 | [56, 505] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__AsIs__1SHOT__20231215_191211__826.json | 0.0 | 0.9 | missing | 0.3 | |
| 2352 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231213_213343__523 | 1 | 0.00184741 | 17.9522 | 2 | [58, 209] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__InJulia__1SHOT__20231213_213343__523.json | 80.0 | missing | missing | missing | |
| 2353 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_184641__105 | 1 | 0.0033926 | 13.4489 | 2 | [58, 400] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__InJulia__1SHOT__20231225_184641__105.json | 80.0 | missing | missing | missing | |
| 2354 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_184645__626 | 5 | 0.00171797 | 4.39661 | 2 | [58, 193] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__InJulia__1SHOT__20231225_184645__626.json | 100.0 | missing | missing | missing | |
| 2355 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_192037__883 | 0 | 0.00166134 | 8.75776 | 0 | [58, 186] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__InJulia__1SHOT__20231227_192037__883.json | 50.0 | missing | missing | missing | |
| 2356 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_192043__213 | 5 | 0.00142673 | 6.01317 | 2 | [58, 157] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__InJulia__1SHOT__20231227_192043__213.json | 100.0 | missing | missing | missing | |
| 2357 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium--optim | InJulia | 1SHOT | true | true | 5 | 20231215_191125__171 | 5 | 0.0 | 22.9599 | 2 | [58, 255] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__InJulia__1SHOT__20231215_191125__171.json | 100.0 | 0.9 | missing | 0.3 | |
| 2358 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_213325__788 | 5 | 0.00127585 | 9.82908 | 2 | [98, 125] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_213325__788.json | 100.0 | missing | missing | missing | |
| 2359 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184621__646 | 5 | 0.00130821 | 2.98558 | 2 | [98, 129] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_184621__646.json | 100.0 | missing | missing | missing | |
| 2360 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184627__480 | 5 | 0.00124349 | 5.76365 | 2 | [98, 121] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_184627__480.json | 100.0 | missing | missing | missing | |
| 2361 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_192016__215 | 0 | 0.00146192 | 7.66296 | 0 | [98, 148] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_192016__215.json | 50.0 | missing | missing | missing | |
| 2362 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_192028__696 | 5 | 0.00141338 | 12.441 | 2 | [98, 142] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_192028__696.json | 100.0 | missing | missing | missing | |
| 2363 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_191102__362 | 5 | 0.0 | 13.0126 | 2 | [98, 122] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_191102__362.json | 100.0 | 0.9 | missing | 0.3 | |
| 2364 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_213315__225 | 0 | 0.00295085 | 24.9829 | 0 | [209, 295] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_213315__225.json | 50.0 | missing | missing | missing | |
| 2365 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184554__295 | 5 | 0.00336344 | 30.0977 | 2 | [209, 346] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184554__295.json | 100.0 | missing | missing | missing | |
| 2366 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184618__249 | 1 | 0.00508661 | 24.1732 | 2 | [209, 559] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184618__249.json | 80.0 | missing | missing | missing | |
| 2367 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_191933__319 | 0 | 0.00367895 | 26.5941 | 0 | [209, 385] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191933__319.json | 25.0 | missing | missing | missing | |
| 2368 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_192008__667 | 1 | 0.00416435 | 35.2362 | 1 | [209, 445] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_192008__667.json | 67.5 | missing | missing | missing | |
| 2369 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_191049__290 | 5 | 0.0 | 37.2921 | 2 | [209, 373] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_191049__290.json | 100.0 | 0.9 | missing | 0.3 | |
| 2370 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_213535__914 | 1 | 0.00600668 | 53.0923 | 1 | [361, 622] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_213535__914.json | 67.5 | missing | missing | missing | |
| 2371 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_184801__524 | 1 | 0.00562645 | 13.1336 | 2 | [361, 575] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184801__524.json | 80.0 | missing | missing | missing | |
| 2372 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_184816__687 | 5 | 0.0040489 | 14.5488 | 2 | [361, 380] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184816__687.json | 100.0 | missing | missing | missing | |
| 2373 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_192158__103 | 5 | 0.00589342 | 34.9647 | 2 | [361, 608] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_192158__103.json | 100.0 | missing | missing | missing | |
| 2374 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_192226__578 | 1 | 0.00343406 | 27.2585 | 1 | [361, 304] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_192226__578.json | 67.5 | missing | missing | missing | |
| 2375 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_191448__285 | 0 | 0.0 | 92.9791 | 0 | [361, 640] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_191448__285.json | 50.0 | 0.9 | missing | 0.3 | |
| 2376 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_213441__936 | 5 | 0.00474463 | 38.2799 | 2 | [358, 467] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_213441__936.json | 100.0 | missing | missing | missing | |
| 2377 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184732__833 | 1 | 0.00584487 | 15.0383 | 1 | [358, 603] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_184732__833.json | 67.5 | missing | missing | missing | |
| 2378 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184748__987 | 0 | 0.00659724 | 15.9394 | 0 | [358, 696] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_184748__987.json | 50.0 | missing | missing | missing | |
| 2379 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_192102__834 | 1 | 0.00768939 | 18.9542 | 2 | [358, 831] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_192102__834.json | 80.0 | missing | missing | missing | |
| 2380 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_192123__988 | 1 | 0.00568307 | 20.685 | 2 | [358, 583] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_192123__988.json | 80.0 | missing | missing | missing | |
| 2381 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_191315__658 | 1 | 0.0 | 64.0725 | 2 | [358, 690] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_191315__658.json | 80.0 | 0.9 | missing | 0.3 | |
| 2382 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231213_213242__489 | 0 | 0.000592365 | 5.44326 | 0 | [55, 287] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__AsIs__1SHOT__20231213_213242__489.json | 0.0 | missing | missing | missing | |
| 2383 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_184444__259 | 0 | 0.000563265 | 3.75626 | 0 | [55, 272] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__AsIs__1SHOT__20231225_184444__259.json | 0.0 | missing | missing | missing | |
| 2384 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_184446__283 | 0 | 0.000322705 | 2.17783 | 0 | [55, 148] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__AsIs__1SHOT__20231225_184446__283.json | 0.0 | missing | missing | missing | |
| 2385 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small--optim | AsIs | 1SHOT | false | false | 5 | 20231215_190957__735 | 0 | 0.0 | 5.76926 | 0 | [55, 445] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__AsIs__1SHOT__20231215_190957__735.json | 0.0 | 0.9 | missing | 0.3 | |
| 2386 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231213_213237__205 | 5 | 0.000570379 | 5.12749 | 2 | [57, 275] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__InJulia__1SHOT__20231213_213237__205.json | 100.0 | missing | missing | missing | |
| 2387 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_184437__704 | 5 | 0.000430699 | 2.76399 | 2 | [57, 203] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__InJulia__1SHOT__20231225_184437__704.json | 100.0 | missing | missing | missing | |
| 2388 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_184440__834 | 1 | 0.000393839 | 2.61391 | 2 | [57, 184] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__InJulia__1SHOT__20231225_184440__834.json | 80.0 | missing | missing | missing | |
| 2389 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_191838__513 | 5 | 0.000459799 | 3.06383 | 2 | [57, 218] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__InJulia__1SHOT__20231227_191838__513.json | 100.0 | missing | missing | missing | |
| 2390 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_191841__472 | 5 | 0.000430699 | 2.81391 | 2 | [57, 203] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__InJulia__1SHOT__20231227_191841__472.json | 100.0 | missing | missing | missing | |
| 2391 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small--optim | InJulia | 1SHOT | true | true | 5 | 20231215_190952__662 | 5 | 0.0 | 4.11954 | 2 | [57, 310] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__InJulia__1SHOT__20231215_190952__662.json | 100.0 | 0.9 | missing | 0.3 | |
| 2392 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_213232__352 | 1 | 0.000646053 | 6.3038 | 2 | [99, 300] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_213232__352.json | 80.0 | missing | missing | missing | |
| 2393 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184432__598 | 0 | 0.000322073 | 1.93323 | 0 | [99, 133] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_184432__598.json | 50.0 | missing | missing | missing | |
| 2394 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184435__629 | 1 | 0.000413253 | 2.71918 | 2 | [99, 180] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_184435__629.json | 80.0 | missing | missing | missing | |
| 2395 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191833__628 | 0 | 0.000630533 | 4.0368 | 0 | [99, 292] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_191833__628.json | 50.0 | missing | missing | missing | |
| 2396 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191835__796 | 0 | 0.000289093 | 1.73749 | 0 | [99, 116] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_191835__796.json | 50.0 | missing | missing | missing | |
| 2397 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_190947__735 | 1 | 0.0 | 3.79426 | 2 | [99, 273] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_190947__735.json | 80.0 | 0.9 | missing | 0.3 | |
| 2398 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_213225__950 | 5 | 0.00082457 | 7.62936 | 2 | [210, 355] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_213225__950.json | 100.0 | missing | missing | missing | |
| 2399 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_184424__132 | 0 | 0.00095261 | 5.76439 | 0 | [210, 421] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184424__132.json | 25.0 | missing | missing | missing | |
| 2400 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184430__118 | 0 | 0.00092739 | 5.72122 | 0 | [210, 408] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184430__118.json | 50.0 | missing | missing | missing | |
| 2401 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_191824__700 | 5 | 0.00073533 | 5.94681 | 2 | [210, 309] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191824__700.json | 100.0 | missing | missing | missing | |
| 2402 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_191829__345 | 5 | 0.00075667 | 4.86734 | 2 | [210, 320] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191829__345.json | 100.0 | missing | missing | missing | |
| 2403 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_190944__514 | 5 | 0.0 | 4.32591 | 2 | [210, 320] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_190944__514.json | 100.0 | 0.9 | missing | 0.3 | |
| 2404 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_213250__506 | 1 | 0.000996635 | 6.21321 | 2 | [365, 392] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_213250__506.json | 80.0 | missing | missing | missing | |
| 2405 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_184513__538 | 0 | 0.0011596 | 7.67929 | 0 | [365, 476] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184513__538.json | 0.0 | missing | missing | missing | |
| 2406 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_184524__192 | 1 | 0.00134778 | 11.1582 | 2 | [365, 573] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184524__192.json | 80.0 | missing | missing | missing | |
| 2407 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_191858__665 | 0 | 0.000946195 | 5.1395 | 0 | [365, 366] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191858__665.json | 50.0 | missing | missing | missing | |
| 2408 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_191906__725 | 1 | 0.00143314 | 8.47034 | 2 | [365, 617] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191906__725.json | 80.0 | missing | missing | missing | |
| 2409 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_191012__426 | 5 | 0.0 | 7.51163 | 2 | [365, 545] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_191012__426.json | 100.0 | 0.9 | missing | 0.3 | |
| 2410 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_213243__424 | 0 | 0.000337681 | 0.98892 | 0 | [363, 53] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_213243__424.json | 0.0 | missing | missing | missing | |
| 2411 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184459__927 | 5 | 0.00174612 | 12.4526 | 2 | [363, 779] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_184459__927.json | 100.0 | missing | missing | missing | |
| 2412 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_184505__445 | 0 | 0.00113696 | 6.30493 | 0 | [363, 465] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_184505__445.json | 0.0 | missing | missing | missing | |
| 2413 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_191846__145 | 0 | 0.00103996 | 5.64586 | 0 | [363, 415] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_191846__145.json | 25.0 | missing | missing | missing | |
| 2414 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191852__835 | 5 | 0.0010613 | 5.90262 | 2 | [363, 426] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_191852__835.json | 100.0 | missing | missing | missing | |
| 2415 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-small--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_191004__439 | 0 | 0.0 | 6.50011 | 0 | [363, 483] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_191004__439.json | 0.0 | 0.9 | missing | 0.3 | |
| 2416 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_213208__242 | 0 | 0.000165797 | 5.06628 | 0 | [55, 349] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__AsIs__1SHOT__20231213_213208__242.json | 0.0 | missing | missing | missing | |
| 2417 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_184358__645 | 0 | 0.00016625 | 3.05008 | 0 | [55, 350] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__AsIs__1SHOT__20231225_184358__645.json | 0.0 | missing | missing | missing | |
| 2418 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_184401__750 | 0 | 0.000133181 | 2.43007 | 0 | [55, 277] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__AsIs__1SHOT__20231225_184401__750.json | 0.0 | missing | missing | missing | |
| 2419 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny--optim | AsIs | 1SHOT | false | false | 5 | 20231215_190934__711 | 0 | 0.0 | 2.78402 | 0 | [55, 315] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__AsIs__1SHOT__20231215_190934__711.json | 0.0 | 0.9 | missing | 0.3 | |
| 2420 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231213_213202__630 | 1 | 0.00017106 | 3.72743 | 1 | [57, 360] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__InJulia__1SHOT__20231213_213202__630.json | 67.5 | missing | missing | missing | |
| 2421 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_184353__977 | 0 | 0.000109905 | 1.98411 | 0 | [57, 225] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__InJulia__1SHOT__20231225_184353__977.json | 50.0 | missing | missing | missing | |
| 2422 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_184355__653 | 1 | 0.000124401 | 2.31085 | 1 | [57, 257] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__InJulia__1SHOT__20231225_184355__653.json | 67.5 | missing | missing | missing | |
| 2423 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | InJulia | 1SHOT | false | false | 5 | 20231227_191749__750 | 0 | 0.000216813 | 4.11811 | 0 | [57, 461] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__InJulia__1SHOT__20231227_191749__750.json | 0.0 | missing | missing | missing | |
| 2424 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_191753__868 | 0 | 0.00022089 | 4.22925 | 0 | [57, 470] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__InJulia__1SHOT__20231227_191753__868.json | 50.0 | missing | missing | missing | |
| 2425 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny--optim | InJulia | 1SHOT | true | true | 5 | 20231215_190931__835 | 1 | 0.0 | 3.32909 | 1 | [57, 376] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__InJulia__1SHOT__20231215_190931__835.json | 67.5 | 0.9 | missing | 0.3 | |
| 2426 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_213159__597 | 0 | 7.3656e-5 | 2.15645 | 0 | [99, 132] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_213159__597.json | 50.0 | missing | missing | missing | |
| 2427 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184349__159 | 0 | 7.8639e-5 | 10.613 | 0 | [99, 143] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_184349__159.json | 50.0 | missing | missing | missing | |
| 2428 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184350__141 | 1 | 8.7699e-5 | 1.51659 | 1 | [99, 163] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_184350__141.json | 67.5 | missing | missing | missing | |
| 2429 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191743__179 | 0 | 9.9477e-5 | 7.32826 | 0 | [99, 189] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_191743__179.json | 50.0 | missing | missing | missing | |
| 2430 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_191745__761 | 0 | 7.5468e-5 | 1.41575 | 0 | [99, 136] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_191745__761.json | 50.0 | missing | missing | missing | |
| 2431 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_190927__123 | 0 | 0.0 | 1.01656 | 0 | [99, 109] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_190927__123.json | 50.0 | 0.9 | missing | 0.3 | |
| 2432 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_213156__456 | 0 | 0.000223737 | 8.99227 | 0 | [210, 429] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_213156__456.json | 0.0 | missing | missing | missing | |
| 2433 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184335__511 | 0 | 0.000188403 | 7.90726 | 2 | [210, 351] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184335__511.json | 75.0 | missing | missing | missing | |
| 2434 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184338__107 | 0 | 0.000158958 | 2.68468 | 0 | [210, 286] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184338__107.json | 50.0 | missing | missing | missing | |
| 2435 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_191730__497 | 0 | 0.000135855 | 7.37908 | 0 | [210, 235] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191730__497.json | 50.0 | missing | missing | missing | |
| 2436 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_191736__589 | 0 | 0.000250464 | 5.36743 | 2 | [210, 488] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_191736__589.json | 75.0 | missing | missing | missing | |
| 2437 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_190926__618 | 0 | 0.0 | 4.34378 | 0 | [210, 268] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_190926__618.json | 0.0 | 0.9 | missing | 0.3 | |
| 2438 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_213218__144 | 0 | 0.000276241 | 4.84818 | 0 | [365, 497] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_213218__144.json | 50.0 | missing | missing | missing | |
| 2439 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_184412__660 | 0 | 0.000221428 | 3.27052 | 2 | [365, 376] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184412__660.json | 75.0 | missing | missing | missing | |
| 2440 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_184418__939 | 1 | 0.000364123 | 6.05859 | 1 | [365, 691] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184418__939.json | 67.5 | missing | missing | missing | |
| 2441 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_191815__705 | 0 | 0.000370012 | 11.8541 | 0 | [365, 704] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191815__705.json | 50.0 | missing | missing | missing | |
| 2442 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_191818__745 | 1 | 0.00020512 | 3.2361 | 1 | [365, 340] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_191818__745.json | 67.5 | missing | missing | missing | |
| 2443 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_190939__615 | 1 | 0.0 | 2.61328 | 1 | [365, 291] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_190939__615.json | 67.5 | 0.9 | missing | 0.3 | |
| 2444 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_213213__229 | 1 | 0.000241986 | 4.91806 | 1 | [363, 422] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_213213__229.json | 67.5 | missing | missing | missing | |
| 2445 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184405__769 | 0 | 0.000242439 | 3.81194 | 0 | [363, 423] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_184405__769.json | 50.0 | missing | missing | missing | |
| 2446 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_184409__178 | 0 | 0.000247875 | 3.88035 | 0 | [363, 435] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_184409__178.json | 50.0 | missing | missing | missing | |
| 2447 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191758__311 | 0 | 0.000251952 | 4.21454 | 0 | [363, 444] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_191758__311.json | 50.0 | missing | missing | missing | |
| 2448 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_191803__538 | 0 | 0.000278226 | 4.65588 | 0 | [363, 502] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_191803__538.json | 50.0 | missing | missing | missing | |
| 2449 | Apple-MacBook-Pro-M1 | count_model_rows | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_190936__433 | 0 | 0.0 | 2.4779 | 0 | [363, 275] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_190936__433.json | 50.0 | 0.9 | missing | 0.3 | |
| 2450 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_215734__967 | 0 | 0.0 | 10.0006 | 0 | [44, 306] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_215734__967.json | 0.0 | missing | missing | missing | |
| 2451 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_215747__732 | 0 | 0.0 | 12.8772 | 0 | [1, 402] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_215747__732.json | 0.0 | missing | missing | missing | |
| 2452 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_215801__299 | 0 | 0.0 | 13.8355 | 0 | [1, 430] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_215801__299.json | 0.0 | missing | missing | missing | |
| 2453 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_003522__577 | 0 | 0.0 | 2.79478 | 0 | [53, 64] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_003522__577.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2454 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_003528__872 | 0 | 0.0 | 5.13002 | 0 | [53, 126] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_003528__872.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2455 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_215710__587 | 0 | 0.0 | 11.7271 | 0 | [1, 368] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_215710__587.json | 25.0 | missing | missing | missing | |
| 2456 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_215724__734 | 0 | 0.0 | 13.4437 | 0 | [1, 418] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_215724__734.json | 0.0 | missing | missing | missing | |
| 2457 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_003514__205 | 1 | 0.0 | 13.5121 | 1 | [56, 343] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_003514__205.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2458 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_003519__296 | 0 | 0.0 | 5.21206 | 0 | [56, 128] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_003519__296.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2459 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231226_221455__488 | 0 | 0.0 | 5.9337 | 0 | [56, 147] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_221455__488.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2460 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_215620__882 | 0 | 0.0 | 6.5107 | 2 | [1, 207] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_215620__882.json | 75.0 | missing | missing | missing | |
| 2461 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_215638__188 | 0 | 0.0 | 18.4253 | 0 | [1, 553] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_215638__188.json | 50.0 | missing | missing | missing | |
| 2462 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003458__681 | 0 | 0.0 | 3.47413 | 0 | [98, 74] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003458__681.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2463 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003501__646 | 0 | 0.0 | 2.23043 | 0 | [98, 41] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003501__646.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2464 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_221449__501 | 0 | 0.0 | 2.80355 | 0 | [98, 56] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_221449__501.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2465 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_215550__734 | 0 | 0.0 | 8.98351 | 0 | [1, 272] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_215550__734.json | 50.0 | missing | missing | missing | |
| 2466 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_215603__210 | 0 | 0.0 | 12.8947 | 0 | [1, 383] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_215603__210.json | 50.0 | missing | missing | missing | |
| 2467 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_003442__929 | 0 | 0.0 | 15.839 | 0 | [209, 230] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003442__929.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2468 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_003455__232 | 0 | 0.0 | 13.1729 | 0 | [209, 309] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003455__232.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2469 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_221446__636 | 0 | 0.0 | 10.7512 | 0 | [209, 110] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221446__636.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2470 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_215931__417 | 0 | 0.0 | 19.054 | 0 | [1, 526] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215931__417.json | 50.0 | missing | missing | missing | |
| 2471 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_215942__856 | 0 | 0.0 | 11.1712 | 0 | [1, 319] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215942__856.json | 50.0 | missing | missing | missing | |
| 2472 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_003600__981 | 0 | 0.0 | 11.5132 | 0 | [365, 242] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003600__981.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2473 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_003610__916 | 1 | 0.0 | 9.9214 | 1 | [365, 202] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003610__916.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2474 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_221537__619 | 0 | 0.0 | 20.1622 | 0 | [365, 455] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221537__619.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2475 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_215838__369 | 0 | 0.0 | 15.346 | 1 | [1, 431] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215838__369.json | 62.5 | missing | missing | missing | |
| 2476 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_215855__492 | 0 | 0.0 | 17.3169 | 0 | [1, 482] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215855__492.json | 25.0 | missing | missing | missing | |
| 2477 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003537__791 | 1 | 0.0 | 9.64523 | 1 | [363, 195] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003537__791.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2478 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_003549__328 | 0 | 0.0 | 10.8939 | 0 | [363, 226] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003549__328.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2479 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_221516__380 | 0 | 0.0 | 21.4807 | 0 | [363, 487] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_221516__380.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2480 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_224733__816 | 0 | 0.0 | 10.4385 | 0 | [55, 335] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224733__816.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2481 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_224746__855 | 0 | 0.0 | 12.2269 | 0 | [55, 392] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224746__855.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2482 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_224759__931 | 0 | 0.0 | 13.6106 | 0 | [55, 436] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224759__931.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2483 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_224816__762 | 0 | 0.0 | 16.7234 | 0 | [55, 535] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224816__762.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2484 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_224823__819 | 0 | 0.0 | 6.50545 | 0 | [55, 206] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224823__819.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2485 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224709__317 | 0 | 0.0 | 2.33556 | 0 | [97, 58] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224709__317.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2486 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224712__222 | 1 | 0.0 | 2.20618 | 1 | [97, 54] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224712__222.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2487 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_224714__157 | 0 | 0.0 | 2.29744 | 0 | [97, 56] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224714__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2488 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224720__725 | 0 | 0.0 | 5.94903 | 0 | [97, 177] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224720__725.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2489 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224723__348 | 0 | 0.0 | 2.21717 | 0 | [97, 54] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224723__348.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2490 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_224621__198 | 0 | 0.0 | 14.9847 | 2 | [208, 416] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224621__198.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2491 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_224634__966 | 0 | 0.0 | 13.107 | 0 | [208, 390] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224634__966.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2492 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_224649__826 | 0 | 0.0 | 15.097 | 2 | [208, 452] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224649__826.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2493 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_224659__531 | 0 | 0.0 | 9.97539 | 0 | [208, 291] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224659__531.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2494 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_224707__291 | 0 | 0.0 | 7.68919 | 0 | [208, 218] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224707__291.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2495 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_224955__265 | 0 | 0.0 | 14.458 | 0 | [364, 401] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_224955__265.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2496 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225012__671 | 0 | 0.0 | 16.3343 | 0 | [364, 458] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225012__671.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2497 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_225024__360 | 0 | 0.0 | 12.2992 | 0 | [364, 334] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225024__360.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2498 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225041__117 | 0 | 0.0 | 16.6822 | 2 | [364, 469] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225041__117.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2499 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_225056__555 | 0 | 0.0 | 15.5444 | 0 | [364, 434] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225056__555.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2500 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224839__202 | 1 | 0.0 | 15.9437 | 2 | [362, 447] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224839__202.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2501 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224858__743 | 0 | 0.0 | 18.4795 | 0 | [362, 522] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224858__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2502 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224910__688 | 0 | 0.0 | 11.8807 | 0 | [362, 322] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224910__688.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2503 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224924__452 | 0 | 0.0 | 14.1274 | 1 | [362, 391] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224924__452.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2504 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224941__486 | 0 | 0.0 | 15.7683 | 0 | [362, 441] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224941__486.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2505 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_225244__808 | 0 | 0.0 | 10.8758 | 0 | [55, 274] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225244__808.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2506 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_225303__729 | 0 | 0.0 | 19.3672 | 0 | [55, 490] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225303__729.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2507 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_225319__405 | 1 | 0.0 | 15.6836 | 1 | [55, 397] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225319__405.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2508 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_225337__693 | 1 | 0.0 | 17.4994 | 1 | [55, 443] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225337__693.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2509 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_225355__678 | 0 | 0.0 | 18.3585 | 0 | [55, 465] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225355__678.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2510 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225215__824 | 0 | 0.0 | 6.02155 | 0 | [97, 140] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225215__824.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2511 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225221__304 | 0 | 0.0 | 5.92214 | 0 | [97, 137] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225221__304.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2512 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225226__473 | 0 | 0.0 | 4.16356 | 0 | [97, 91] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225226__473.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2513 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225229__854 | 0 | 0.0 | 3.30812 | 0 | [97, 69] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225229__854.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2514 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225233__252 | 0 | 0.0 | 3.45889 | 0 | [97, 73] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225233__252.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2515 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225110__628 | 1 | 0.0 | 13.1033 | 1 | [208, 285] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225110__628.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2516 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225123__779 | 0 | 0.0 | 13.5967 | 2 | [208, 318] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225123__779.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2517 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_225135__782 | 0 | 0.0 | 11.7692 | 0 | [208, 272] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225135__782.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2518 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225153__940 | 0 | 0.0 | 18.3907 | 0 | [208, 438] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225153__940.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2519 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_225209__972 | 0 | 0.0 | 15.6115 | 0 | [208, 369] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225209__972.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2520 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225546__402 | 0 | 0.0 | 10.5431 | 0 | [364, 216] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225546__402.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2521 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225608__907 | 1 | 0.0 | 21.2866 | 1 | [364, 480] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225608__907.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2522 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_225625__904 | 0 | 0.0 | 16.9681 | 0 | [364, 375] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225625__904.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2523 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225636__970 | 0 | 0.0 | 11.8515 | 0 | [364, 249] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225636__970.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2524 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_225658__673 | 0 | 0.0 | 20.826 | 0 | [364, 469] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225658__673.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2525 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_225415__375 | 0 | 0.0 | 19.505 | 0 | [362, 437] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225415__375.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2526 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_225428__884 | 0 | 0.0 | 13.1764 | 0 | [362, 282] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225428__884.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2527 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_225445__234 | 0 | 0.0 | 16.2774 | 0 | [362, 358] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225445__234.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2528 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_225509__986 | 0 | 0.0 | 24.4287 | 0 | [362, 556] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225509__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2529 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_225536__595 | 0 | 0.0 | 26.055 | 0 | [362, 595] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225536__595.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2530 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_120315__261 | 0 | 0.0 | 14.2249 | 0 | [52, 261] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_120315__261.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2531 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_120328__963 | 0 | 0.0 | 13.1424 | 0 | [52, 241] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_120328__963.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2532 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_120244__330 | 0 | 0.0 | 15.4474 | 0 | [55, 284] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_120244__330.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2533 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_120301__205 | 0 | 0.0 | 16.591 | 0 | [55, 305] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_120301__205.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2534 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_221745__734 | 0 | 0.0 | 13.1733 | 0 | [55, 243] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_221745__734.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2535 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_120223__300 | 0 | 0.0 | 10.3157 | 0 | [97, 181] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120223__300.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2536 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_120228__757 | 0 | 0.0 | 5.80172 | 0 | [97, 96] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120228__757.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2537 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_221732__259 | 0 | 0.0 | 4.85736 | 0 | [97, 79] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_221732__259.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2538 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_120152__580 | 0 | 0.0 | 20.9227 | 0 | [208, 365] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120152__580.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2539 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_120212__478 | 0 | 0.0 | 19.9863 | 2 | [208, 348] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120212__478.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2540 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_221727__446 | 1 | 0.0 | 25.8075 | 1 | [208, 293] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221727__446.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2541 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_120435__893 | 0 | 0.0 | 22.4704 | 0 | [364, 372] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120435__893.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2542 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_120503__239 | 1 | 0.0 | 28.1081 | 1 | [364, 473] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120503__239.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2543 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_221851__107 | 0 | 0.0 | 28.1158 | 0 | [364, 477] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221851__107.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2544 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_120351__688 | 0 | 0.0 | 22.518 | 0 | [362, 373] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120351__688.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2545 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_120412__206 | 1 | 0.0 | 21.5563 | 1 | [362, 356] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120412__206.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2546 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_221823__917 | 1 | 0.0 | 38.0978 | 1 | [362, 655] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_221823__917.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2547 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_103555__565 | 1 | 0.0 | 41.5998 | 1 | [61, 241] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_103555__565.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2548 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_103621__851 | 0 | 0.0 | 25.9109 | 0 | [61, 145] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_103621__851.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2549 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_103712__361 | 0 | 0.0 | 49.9946 | 0 | [61, 276] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_103712__361.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2550 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_142911__277 | 0 | 0.0 | 31.6657 | 0 | [61, 186] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_142911__277.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2551 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_142956__185 | 0 | 0.0 | 44.484 | 0 | [61, 264] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_142956__185.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2552 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_103449__127 | 0 | 0.0 | 13.5708 | 0 | [100, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_103449__127.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2553 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_103500__680 | 0 | 0.0 | 11.3195 | 0 | [100, 51] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_103500__680.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2554 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_103513__767 | 5 | 0.0 | 13.0491 | 2 | [100, 62] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_103513__767.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2555 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_142756__756 | 0 | 0.0 | 8.5501 | 0 | [100, 34] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_142756__756.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2556 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_142839__221 | 0 | 0.0 | 43.2631 | 1 | [100, 247] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_142839__221.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2557 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_103220__806 | 0 | 0.0 | 67.2272 | 0 | [211, 338] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_103220__806.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2558 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_103340__175 | 0 | 0.0 | 80.4441 | 0 | [211, 443] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_103340__175.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2559 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_103435__719 | 1 | 0.0 | 54.744 | 1 | [211, 299] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_103435__719.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2560 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_142731__384 | 0 | 0.0 | 59.987 | 0 | [211, 330] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_142731__384.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2561 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_142747__969 | 0 | 0.0 | 16.0041 | 0 | [211, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_142747__969.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2562 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_104055__291 | 0 | 0.0 | 46.4848 | 0 | [374, 222] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104055__291.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2563 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_104201__597 | 1 | 0.0 | 65.1742 | 1 | [374, 332] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104201__597.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2564 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_104309__915 | 0 | 0.0 | 68.0833 | 1 | [374, 349] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104309__915.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2565 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_143301__236 | 5 | 0.0 | 42.2901 | 2 | [374, 196] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_143301__236.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2566 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_143345__597 | 1 | 0.0 | 43.6346 | 1 | [374, 204] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_143345__597.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2567 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_103804__531 | 0 | 0.0 | 50.9949 | 0 | [372, 237] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_103804__531.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2568 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_103944__955 | 0 | 0.0 | 99.6307 | 0 | [372, 525] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_103944__955.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2569 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_104009__839 | 0 | 0.0 | 24.7735 | 0 | [372, 92] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_104009__839.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2570 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_143049__145 | 0 | 0.0 | 52.9837 | 0 | [372, 259] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_143049__145.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2571 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_143219__477 | 0 | 0.0 | 89.837 | 0 | [372, 474] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_143219__477.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2572 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_220152__846 | 0 | 0.0 | 17.8522 | 0 | [44, 534] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_220152__846.json | 0.0 | missing | missing | missing | |
| 2573 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_220210__961 | 0 | 0.0 | 17.3659 | 0 | [1, 530] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_220210__961.json | 0.0 | missing | missing | missing | |
| 2574 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_220223__338 | 0 | 0.0 | 12.8167 | 0 | [1, 400] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_220223__338.json | 0.0 | missing | missing | missing | |
| 2575 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_003717__525 | 0 | 0.0 | 9.93628 | 0 | [61, 251] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_003717__525.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2576 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_003723__404 | 0 | 0.0 | 5.5681 | 0 | [61, 137] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_003723__404.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2577 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_220119__798 | 0 | 0.0 | 15.2935 | 0 | [1, 471] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_220119__798.json | 0.0 | missing | missing | missing | |
| 2578 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_220134__443 | 0 | 0.0 | 15.5754 | 0 | [1, 479] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_220134__443.json | 25.0 | missing | missing | missing | |
| 2579 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_003701__366 | 1 | 0.0 | 8.35118 | 1 | [64, 210] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_003701__366.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2580 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_003707__118 | 0 | 0.0 | 6.60528 | 0 | [64, 164] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_003707__118.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2581 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231226_221601__573 | 0 | 0.0 | 6.49109 | 0 | [64, 161] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_221601__573.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2582 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_220051__412 | 0 | 0.0 | 3.21347 | 0 | [1, 104] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_220051__412.json | 50.0 | missing | missing | missing | |
| 2583 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_220052__955 | 0 | 0.0 | 1.53799 | 0 | [1, 50] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_220052__955.json | 0.0 | missing | missing | missing | |
| 2584 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_003647__343 | 0 | 0.0 | 2.84394 | 0 | [106, 57] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003647__343.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2585 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_003652__574 | 0 | 0.0 | 4.68083 | 0 | [106, 105] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003652__574.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2586 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_221555__517 | 0 | 0.0 | 3.56213 | 0 | [106, 76] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_221555__517.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2587 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_220016__599 | 0 | 0.0 | 16.9062 | 0 | [1, 493] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220016__599.json | 0.0 | missing | missing | missing | |
| 2588 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_220032__674 | 0 | 0.0 | 15.9453 | 0 | [1, 467] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220032__674.json | 0.0 | missing | missing | missing | |
| 2589 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_003631__889 | 0 | 0.0 | 20.8787 | 0 | [217, 342] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003631__889.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2590 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_003644__389 | 0 | 0.0 | 13.2527 | 0 | [217, 310] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003644__389.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2591 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_221551__431 | 0 | 0.0 | 14.5061 | 0 | [217, 191] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221551__431.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2592 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_220354__445 | 0 | 0.0 | 23.6206 | 0 | [1, 641] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_220354__445.json | 50.0 | missing | missing | missing | |
| 2593 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_220356__920 | 0 | 0.0 | 1.89288 | 0 | [1, 56] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_220356__920.json | 0.0 | missing | missing | missing | |
| 2594 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_003750__204 | 0 | 0.0 | 10.7489 | 0 | [373, 222] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003750__204.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2595 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_003802__717 | 1 | 0.0 | 11.4122 | 1 | [373, 239] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003802__717.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2596 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_221623__540 | 0 | 0.0 | 11.9498 | 0 | [373, 252] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221623__540.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2597 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_220300__264 | 0 | 0.0 | 19.8496 | 0 | [1, 547] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_220300__264.json | 0.0 | missing | missing | missing | |
| 2598 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_220318__589 | 0 | 0.0 | 17.6938 | 0 | [1, 492] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_220318__589.json | 25.0 | missing | missing | missing | |
| 2599 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003730__176 | 0 | 0.0 | 7.20025 | 0 | [371, 133] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003730__176.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2600 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003740__579 | 0 | 0.0 | 9.22224 | 0 | [371, 184] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003740__579.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2601 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_221611__745 | 0 | 0.0 | 9.22555 | 0 | [371, 184] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_221611__745.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2602 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231213_233804__143 | 0 | 0.0 | 13.7769 | 0 | [44, 416] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231213_233804__143.json | 0.0 | missing | missing | missing | |
| 2603 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231224_234754__862 | 0 | 0.0 | 3.86947 | 0 | [60, 120] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231224_234754__862.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2604 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231224_234758__143 | 0 | 0.0 | 4.96894 | 0 | [60, 157] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231224_234758__143.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2605 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231213_233751__850 | 0 | 0.0 | 13.8492 | 0 | [60, 416] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231213_233751__850.json | 25.0 | missing | missing | missing | |
| 2606 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231224_234738__461 | 1 | 0.0 | 5.86908 | 2 | [62, 188] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_234738__461.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2607 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231224_234750__672 | 0 | 0.0 | 11.0488 | 0 | [62, 359] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_234750__672.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2608 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_215558__283 | 0 | 0.0 | 9.13674 | 0 | [62, 293] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_215558__283.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2609 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_233737__854 | 0 | 0.0 | 5.21213 | 0 | [90, 148] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231213_233737__854.json | 50.0 | missing | missing | missing | |
| 2610 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_234726__349 | 0 | 0.0 | 8.27324 | 0 | [104, 257] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_234726__349.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2611 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234732__537 | 0 | 0.0 | 6.50069 | 0 | [104, 199] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_234732__537.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2612 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_215549__585 | 0 | 0.0 | 3.75357 | 0 | [104, 107] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_215549__585.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2613 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_233731__334 | 0 | 0.0 | 16.0344 | 0 | [201, 425] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233731__334.json | 50.0 | missing | missing | missing | |
| 2614 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_234701__993 | 0 | 0.0 | 15.8352 | 0 | [215, 310] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234701__993.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2615 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_234717__729 | 1 | 0.0 | 15.7853 | 1 | [215, 480] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234717__729.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2616 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_215545__766 | 0 | 0.0 | 18.6394 | 2 | [215, 403] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215545__766.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2617 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_233838__799 | 0 | 0.0 | 16.4617 | 0 | [11, 452] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233838__799.json | 0.0 | missing | missing | missing | |
| 2618 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234827__593 | 0 | 0.0 | 8.45354 | 0 | [371, 219] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234827__593.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2619 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234841__322 | 1 | 0.0 | 13.8755 | 2 | [371, 389] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234841__322.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2620 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_215626__275 | 0 | 0.0 | 6.39491 | 0 | [371, 153] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_215626__275.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2621 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_233821__466 | 0 | 0.0 | 16.9021 | 0 | [361, 383] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231213_233821__466.json | 50.0 | missing | missing | missing | |
| 2622 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234809__996 | 1 | 0.0 | 10.5946 | 1 | [369, 286] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_234809__996.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2623 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234818__902 | 0 | 0.0 | 9.20994 | 0 | [369, 243] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_234818__902.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2624 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_215620__454 | 0 | 0.0 | 21.1848 | 0 | [369, 610] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_215620__454.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2625 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231213_234841__947 | 0 | 0.0 | 14.4516 | 0 | [44, 437] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__AsIs__1SHOT__20231213_234841__947.json | 0.0 | missing | missing | missing | |
| 2626 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_001034__294 | 0 | 0.0 | 31.2727 | 0 | [61, 571] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__AsIs__1SHOT__20231225_001034__294.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2627 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_001048__137 | 0 | 0.0 | 13.6329 | 0 | [61, 249] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__AsIs__1SHOT__20231225_001048__137.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2628 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231213_234827__613 | 0 | 0.0 | 13.1533 | 0 | [60, 398] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__InJulia__1SHOT__20231213_234827__613.json | 25.0 | missing | missing | missing | |
| 2629 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_001000__774 | 0 | 0.0 | 18.8294 | 0 | [63, 346] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__InJulia__1SHOT__20231225_001000__774.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2630 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_001003__328 | 0 | 0.0 | 2.84786 | 0 | [63, 44] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__InJulia__1SHOT__20231225_001003__328.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2631 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231226_220408__605 | 0 | 0.0 | 24.4842 | 0 | [63, 451] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__InJulia__1SHOT__20231226_220408__605.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2632 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234814__418 | 0 | 0.0 | 5.96785 | 0 | [90, 172] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231213_234814__418.json | 50.0 | missing | missing | missing | |
| 2633 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000929__442 | 0 | 0.0 | 10.2484 | 0 | [103, 176] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_000929__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2634 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000941__640 | 0 | 0.0 | 12.2004 | 0 | [103, 212] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_000941__640.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2635 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_220344__812 | 0 | 0.0 | 9.61763 | 0 | [103, 164] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_220344__812.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2636 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234808__337 | 0 | 0.0 | 13.8162 | 0 | [201, 365] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234808__337.json | 50.0 | missing | missing | missing | |
| 2637 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000853__196 | 0 | 0.0 | 34.877 | 0 | [214, 429] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000853__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2638 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000918__218 | 0 | 0.0 | 25.4935 | 0 | [214, 435] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000918__218.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2639 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_220334__988 | 0 | 0.0 | 37.6627 | 0 | [214, 494] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220334__988.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2640 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234916__427 | 0 | 0.0 | 14.678 | 0 | [11, 406] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234916__427.json | 50.0 | missing | missing | missing | |
| 2641 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_001152__922 | 0 | 0.0 | 27.7576 | 0 | [367, 442] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_001152__922.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2642 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_001222__951 | 1 | 0.0 | 30.1904 | 1 | [367, 484] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_001222__951.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2643 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_220510__486 | 0 | 0.0 | 42.6368 | 0 | [367, 696] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220510__486.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2644 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_234901__646 | 0 | 0.0 | 20.1089 | 0 | [361, 469] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231213_234901__646.json | 0.0 | missing | missing | missing | |
| 2645 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_001102__594 | 0 | 0.0 | 14.1329 | 0 | [364, 202] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_001102__594.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2646 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_001124__452 | 0 | 0.0 | 21.9011 | 0 | [364, 340] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_001124__452.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2647 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_220427__666 | 0 | 0.0 | 19.4366 | 0 | [364, 298] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_220427__666.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2648 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_220618__434 | 0 | 0.0 | 15.2394 | 0 | [44, 460] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_220618__434.json | 0.0 | missing | missing | missing | |
| 2649 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_220631__815 | 0 | 0.0 | 12.6809 | 0 | [1, 396] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_220631__815.json | 0.0 | missing | missing | missing | |
| 2650 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_220640__723 | 0 | 0.0 | 8.7447 | 0 | [1, 279] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_220640__723.json | 0.0 | missing | missing | missing | |
| 2651 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_003920__670 | 0 | 0.0 | 16.2482 | 0 | [51, 623] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_003920__670.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2652 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_003935__826 | 0 | 0.0 | 14.5375 | 0 | [51, 561] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_003935__826.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2653 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_220545__201 | 0 | 0.0 | 18.8485 | 0 | [1, 571] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_220545__201.json | 25.0 | missing | missing | missing | |
| 2654 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231219_220603__798 | 0 | 0.0 | 17.7743 | 0 | [1, 541] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_220603__798.json | 0.0 | missing | missing | missing | |
| 2655 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_003845__635 | 0 | 0.0 | 0.848594 | 0 | [54, 28] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_003845__635.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2656 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_003904__702 | 0 | 0.0 | 19.0177 | 0 | [54, 722] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_003904__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2657 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_221654__796 | 0 | 0.0 | 1.36434 | 0 | [54, 49] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_221654__796.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2658 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_220501__368 | 0 | 0.0 | 13.3985 | 0 | [1, 412] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_220501__368.json | 0.0 | missing | missing | missing | |
| 2659 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_220512__562 | 0 | 0.0 | 11.2781 | 0 | [1, 350] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_220512__562.json | 50.0 | missing | missing | missing | |
| 2660 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003843__883 | 0 | 0.0 | 1.80896 | 0 | [91, 63] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_003843__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2661 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003844__290 | 0 | 0.0 | 1.16057 | 0 | [91, 36] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_003844__290.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2662 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_221653__798 | 0 | 0.0 | 25.2228 | 0 | [91, 925] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_221653__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2663 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_220427__284 | 0 | 0.0 | 13.9989 | 0 | [1, 414] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220427__284.json | 50.0 | missing | missing | missing | |
| 2664 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_220438__328 | 0 | 0.0 | 10.3601 | 0 | [1, 311] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220438__328.json | 25.0 | missing | missing | missing | |
| 2665 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_003826__758 | 0 | 0.0 | 23.9141 | 0 | [199, 728] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003826__758.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2666 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_003841__401 | 0 | 0.0 | 15.1561 | 0 | [199, 550] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003841__401.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2667 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_221627__986 | 0 | 0.0 | 4.64606 | 0 | [199, 22] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221627__986.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2668 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_220813__341 | 0 | 0.0 | 21.4744 | 0 | [1, 587] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_220813__341.json | 50.0 | missing | missing | missing | |
| 2669 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_220830__598 | 0 | 0.0 | 16.9847 | 0 | [1, 473] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_220830__598.json | 25.0 | missing | missing | missing | |
| 2670 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_004013__520 | 0 | 0.0 | 16.5171 | 0 | [343, 565] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004013__520.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2671 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_004051__685 | 0 | 0.0 | 37.4903 | 0 | [343, 1243] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004051__685.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2672 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_221701__304 | 0 | 0.0 | 1.77331 | 0 | [343, 26] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221701__304.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2673 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_220716__118 | 0 | 0.0 | 17.038 | 0 | [1, 475] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_220716__118.json | 50.0 | missing | missing | missing | |
| 2674 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_220734__791 | 0 | 0.0 | 17.3187 | 0 | [1, 482] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_220734__791.json | 50.0 | missing | missing | missing | |
| 2675 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_003939__570 | 0 | 0.0 | 4.58815 | 0 | [340, 134] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_003939__570.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2676 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_003957__543 | 0 | 0.0 | 17.3381 | 0 | [340, 594] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_003957__543.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2677 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_221659__760 | 0 | 0.0 | 5.00695 | 0 | [340, 150] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_221659__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2678 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231213_235005__235 | 0 | 0.0 | 14.2832 | 0 | [44, 432] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231213_235005__235.json | 0.0 | missing | missing | missing | |
| 2679 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_001638__248 | 0 | 0.0 | 41.416 | 0 | [69, 320] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_001638__248.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2680 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_001709__244 | 0 | 0.0 | 31.9198 | 0 | [69, 244] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_001709__244.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2681 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231213_234951__780 | 0 | 0.0 | 14.1914 | 0 | [60, 428] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231213_234951__780.json | 50.0 | missing | missing | missing | |
| 2682 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231225_001522__114 | 0 | 0.0 | 37.2775 | 0 | [71, 287] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_001522__114.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2683 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_001556__112 | 5 | 0.0 | 33.66 | 2 | [71, 258] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_001556__112.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2684 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231226_220726__983 | 0 | 0.0 | 39.4345 | 0 | [71, 302] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_220726__983.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2685 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234937__779 | 0 | 0.0 | 6.45624 | 0 | [90, 187] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231213_234937__779.json | 50.0 | missing | missing | missing | |
| 2686 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_001414__539 | 5 | 0.0 | 20.3467 | 2 | [111, 145] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_001414__539.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2687 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_001445__481 | 1 | 0.0 | 31.2445 | 2 | [111, 233] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_001445__481.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2688 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_220647__752 | 0 | 0.0 | 29.3822 | 0 | [111, 217] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_220647__752.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2689 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234930__404 | 0 | 0.0 | 13.9218 | 0 | [201, 368] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234930__404.json | 50.0 | missing | missing | missing | |
| 2690 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_001300__292 | 0 | 0.0 | 38.2029 | 0 | [222, 89] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_001300__292.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2691 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_001353__891 | 0 | 0.0 | 53.0855 | 0 | [222, 387] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_001353__891.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2692 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_220617__839 | 0 | 0.0 | 67.0474 | 0 | [222, 331] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220617__839.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2693 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_235043__441 | 0 | 0.0 | 18.3364 | 0 | [11, 502] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_235043__441.json | 25.0 | missing | missing | missing | |
| 2694 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_001920__254 | 0 | 0.0 | 29.7647 | 0 | [375, 175] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_001920__254.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2695 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_002021__816 | 5 | 0.0 | 60.9093 | 2 | [375, 416] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_002021__816.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2696 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_220920__197 | 0 | 0.0 | 46.0586 | 0 | [375, 302] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220920__197.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2697 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_235025__758 | 0 | 0.0 | 19.4897 | 1 | [361, 453] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231213_235025__758.json | 62.5 | missing | missing | missing | |
| 2698 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_001758__851 | 0 | 0.0 | 48.0835 | 0 | [372, 318] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_001758__851.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2699 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_001850__653 | 1 | 0.0 | 52.3759 | 2 | [372, 351] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_001850__653.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2700 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_220834__609 | 5 | 0.0 | 67.6811 | 2 | [372, 468] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_220834__609.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2701 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_215315__277 | 0 | 0.0 | 10.7248 | 0 | [44, 328] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_215315__277.json | 0.0 | missing | missing | missing | |
| 2702 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_215325__159 | 0 | 0.0 | 10.402 | 0 | [1, 329] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_215325__159.json | 0.0 | missing | missing | missing | |
| 2703 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_215341__617 | 0 | 0.0 | 15.9653 | 0 | [1, 491] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_215341__617.json | 0.0 | missing | missing | missing | |
| 2704 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_003301__228 | 0 | 0.0 | 19.9923 | 0 | [62, 342] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_003301__228.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2705 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_003313__347 | 0 | 0.0 | 11.6474 | 0 | [62, 197] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_003313__347.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2706 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_215250__842 | 0 | 0.0 | 13.6183 | 0 | [1, 423] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_215250__842.json | 25.0 | missing | missing | missing | |
| 2707 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_215304__791 | 0 | 0.0 | 14.3747 | 0 | [1, 445] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_215304__791.json | 0.0 | missing | missing | missing | |
| 2708 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_003223__891 | 0 | 0.0 | 14.5014 | 0 | [64, 247] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_003223__891.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2709 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_003241__830 | 0 | 0.0 | 17.4855 | 0 | [64, 299] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_003241__830.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2710 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_221404__411 | 1 | 0.0 | 18.7704 | 1 | [64, 321] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_221404__411.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2711 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_215214__725 | 0 | 0.0 | 6.50602 | 0 | [1, 207] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_215214__725.json | 50.0 | missing | missing | missing | |
| 2712 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_215224__781 | 0 | 0.0 | 9.68038 | 0 | [1, 303] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_215224__781.json | 50.0 | missing | missing | missing | |
| 2713 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_003155__191 | 0 | 0.0 | 10.2821 | 0 | [106, 164] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003155__191.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2714 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003209__809 | 0 | 0.0 | 13.4768 | 0 | [106, 220] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003209__809.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2715 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_221346__670 | 0 | 0.0 | 17.3357 | 0 | [106, 287] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_221346__670.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2716 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_215149__670 | 0 | 0.0 | 17.2388 | 0 | [1, 502] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_215149__670.json | 50.0 | missing | missing | missing | |
| 2717 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_215200__649 | 0 | 0.0 | 11.2459 | 0 | [1, 337] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_215200__649.json | 0.0 | missing | missing | missing | |
| 2718 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_003125__396 | 0 | 0.0 | 32.5478 | 0 | [217, 372] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003125__396.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2719 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_003145__593 | 0 | 0.0 | 19.656 | 0 | [217, 310] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003145__593.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2720 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_221328__229 | 0 | 0.0 | 26.4703 | 0 | [217, 280] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221328__229.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2721 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_215510__192 | 1 | 0.0 | 17.1119 | 1 | [1, 476] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215510__192.json | 67.5 | missing | missing | missing | |
| 2722 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_215531__423 | 0 | 0.0 | 21.0286 | 0 | [1, 576] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215531__423.json | 25.0 | missing | missing | missing | |
| 2723 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_003410__816 | 0 | 0.0 | 22.0189 | 0 | [373, 324] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003410__816.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2724 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_003426__309 | 0 | 0.0 | 15.8834 | 0 | [373, 221] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003426__309.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2725 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_221435__790 | 0 | 0.0 | 15.9167 | 0 | [373, 222] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221435__790.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2726 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_215415__768 | 0 | 0.0 | 14.9531 | 0 | [1, 420] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215415__768.json | 50.0 | missing | missing | missing | |
| 2727 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_215433__631 | 0 | 0.0 | 17.5486 | 0 | [1, 488] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215433__631.json | 50.0 | missing | missing | missing | |
| 2728 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003331__890 | 0 | 0.0 | 18.0692 | 0 | [371, 258] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003331__890.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2729 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003348__523 | 0 | 0.0 | 17.0537 | 0 | [371, 241] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003348__523.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2730 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_221419__647 | 0 | 0.0 | 14.9166 | 0 | [371, 205] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_221419__647.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2731 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231213_234717__177 | 0 | 0.0 | 10.6312 | 0 | [44, 324] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__AsIs__1SHOT__20231213_234717__177.json | 0.0 | missing | missing | missing | |
| 2732 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_000737__483 | 0 | 0.0 | 3.89131 | 0 | [63, 225] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_000737__483.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2733 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_000739__664 | 0 | 0.0 | 1.99982 | 0 | [63, 113] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_000739__664.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2734 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231213_234706__781 | 0 | 0.0 | 17.9475 | 0 | [60, 535] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__InJulia__1SHOT__20231213_234706__781.json | 0.0 | missing | missing | missing | |
| 2735 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_000727__542 | 0 | 0.0 | 5.68033 | 0 | [66, 323] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_000727__542.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2736 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_000733__386 | 0 | 0.0 | 6.23973 | 0 | [66, 355] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_000733__386.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2737 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231226_220239__331 | 0 | 0.0 | 4.96201 | 0 | [66, 283] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_220239__331.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2738 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234648__510 | 0 | 0.0 | 6.58183 | 1 | [90, 190] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231213_234648__510.json | 62.5 | missing | missing | missing | |
| 2739 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000713__171 | 0 | 0.0 | 3.22453 | 0 | [103, 176] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_000713__171.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2740 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000721__931 | 0 | 0.0 | 8.22005 | 0 | [103, 454] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_000721__931.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2741 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_220234__317 | 0 | 0.0 | 2.72004 | 0 | [103, 146] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_220234__317.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2742 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234642__419 | 0 | 0.0 | 14.3645 | 0 | [201, 380] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234642__419.json | 50.0 | missing | missing | missing | |
| 2743 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000701__904 | 0 | 0.0 | 12.6651 | 0 | [208, 509] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000701__904.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2744 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000710__877 | 0 | 0.0 | 8.11393 | 0 | [208, 421] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000710__877.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2745 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_220232__564 | 0 | 0.0 | 8.61171 | 0 | [208, 308] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220232__564.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2746 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_234754__880 | 0 | 0.0 | 17.5133 | 0 | [11, 481] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234754__880.json | 25.0 | missing | missing | missing | |
| 2747 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_000807__901 | 0 | 0.0 | 13.6964 | 0 | [353, 651] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000807__901.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2748 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_000818__196 | 0 | 0.0 | 11.1766 | 0 | [353, 530] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000818__196.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2749 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_220256__488 | 0 | 0.0 | 7.17056 | 0 | [353, 330] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220256__488.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2750 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_234736__919 | 0 | 0.0 | 19.0238 | 0 | [361, 441] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231213_234736__919.json | 50.0 | missing | missing | missing | |
| 2751 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_000745__524 | 0 | 0.0 | 6.11028 | 0 | [351, 279] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_000745__524.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2752 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_000753__143 | 0 | 0.0 | 7.929 | 0 | [351, 373] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_000753__143.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2753 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_220249__455 | 0 | 0.0 | 9.12219 | 0 | [351, 432] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_220249__455.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2754 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231213_233927__432 | 0 | 0.0 | 10.8635 | 0 | [44, 330] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__AsIs__1SHOT__20231213_233927__432.json | 0.0 | missing | missing | missing | |
| 2755 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231224_234949__716 | 0 | 0.0 | 5.29734 | 0 | [61, 168] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__AsIs__1SHOT__20231224_234949__716.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2756 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231224_234954__911 | 0 | 0.0 | 5.71685 | 0 | [61, 182] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__AsIs__1SHOT__20231224_234954__911.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2757 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | InJulia | 1SHOT | false | false | 5 | 20231213_233916__219 | 0 | 0.0 | 9.82109 | 0 | [60, 297] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__InJulia__1SHOT__20231213_233916__219.json | 0.0 | missing | missing | missing | |
| 2758 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231224_234934__594 | 1 | 0.0 | 7.26865 | 1 | [64, 234] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_234934__594.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2759 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231224_234943__161 | 1 | 0.0 | 8.84993 | 1 | [64, 287] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_234943__161.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2760 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_215713__312 | 1 | 0.0 | 14.6576 | 1 | [64, 476] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_215713__312.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2761 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_233907__141 | 0 | 0.0 | 8.93196 | 0 | [90, 261] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231213_233907__141.json | 50.0 | missing | missing | missing | |
| 2762 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234923__562 | 0 | 0.0 | 5.87122 | 0 | [106, 178] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_234923__562.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2763 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234927__174 | 0 | 0.0 | 4.15057 | 0 | [106, 120] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_234927__174.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2764 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_215658__746 | 1 | 0.0 | 12.1175 | 2 | [106, 383] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_215658__746.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2765 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_233858__375 | 0 | 0.0 | 19.672 | 0 | [201, 523] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233858__375.json | 0.0 | missing | missing | missing | |
| 2766 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_234901__192 | 1 | 0.0 | 20.4737 | 1 | [217, 450] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234901__192.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2767 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_234917__614 | 0 | 0.0 | 15.2863 | 0 | [217, 463] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234917__614.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2768 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_215646__175 | 0 | 0.0 | 19.577 | 0 | [217, 438] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215646__175.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2769 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234001__338 | 1 | 0.0 | 12.0508 | 1 | [11, 316] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234001__338.json | 67.5 | missing | missing | missing | |
| 2770 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_235037__550 | 0 | 0.0 | 7.94165 | 0 | [373, 201] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_235037__550.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2771 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_235105__166 | 0 | 0.0 | 27.8619 | 0 | [373, 806] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_235105__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2772 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_215729__677 | 0 | 0.0 | 8.53079 | 0 | [373, 221] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_215729__677.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2773 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_233948__180 | 0 | 0.0 | 21.1216 | 0 | [361, 491] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231213_233948__180.json | 50.0 | missing | missing | missing | |
| 2774 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_235015__430 | 0 | 0.0 | 20.9847 | 0 | [371, 604] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_235015__430.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2775 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_235029__842 | 0 | 0.0 | 13.3196 | 0 | [371, 371] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_235029__842.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2776 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_215721__942 | 1 | 0.0 | 7.73376 | 1 | [371, 196] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_215721__942.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 2777 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231213_234044__145 | 0 | 0.0 | 9.68611 | 0 | [44, 295] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__AsIs__1SHOT__20231213_234044__145.json | 0.0 | missing | missing | missing | |
| 2778 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231224_235630__420 | 0 | 0.0 | 56.4985 | 0 | [58, 430] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__AsIs__1SHOT__20231224_235630__420.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2779 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231224_235704__586 | 0 | 0.0 | 34.4761 | 0 | [58, 261] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__AsIs__1SHOT__20231224_235704__586.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2780 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231213_234035__395 | 0 | 0.0 | 15.5552 | 0 | [60, 467] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__InJulia__1SHOT__20231213_234035__395.json | 25.0 | missing | missing | missing | |
| 2781 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231224_235418__521 | 0 | 0.0 | 45.7727 | 0 | [61, 348] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_235418__521.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2782 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231224_235533__135 | 1 | 0.0 | 75.1738 | 2 | [61, 570] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_235533__135.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2783 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231226_215927__917 | 0 | 0.0 | 44.8418 | 0 | [61, 342] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_215927__917.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2784 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_234019__569 | 0 | 0.0 | 2.3851 | 0 | [90, 57] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231213_234019__569.json | 0.0 | missing | missing | missing | |
| 2785 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_235303__947 | 0 | 0.0 | 20.1016 | 0 | [100, 138] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_235303__947.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2786 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_235332__979 | 0 | 0.0 | 28.9181 | 0 | [100, 207] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_235332__979.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2787 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_215842__317 | 0 | 0.0 | 31.3383 | 0 | [100, 226] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_215842__317.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2788 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234017__224 | 0 | 0.0 | 16.0019 | 0 | [201, 424] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234017__224.json | 50.0 | missing | missing | missing | |
| 2789 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_235155__921 | 0 | 0.0 | 50.6081 | 0 | [211, 167] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_235155__921.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2790 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_235243__249 | 1 | | 0.0 | 47.1463 | 2 | [211, 328] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_235243__249.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 2791 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_215811__590 | 0 | | 0.0 | 41.2627 | 0 | [211, 109] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215811__590.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 2792 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234126__382 | 0 | | 0.0 | 21.2596 | 0 | [11, 574] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234126__382.json | 50.0 | missing | missing | missing | |
| 2793 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_000004__412 | 0 | | 0.0 | 73.3634 | 0 | [374, 488] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000004__412.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 2794 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_000043__370 | 0 | | 0.0 | 39.1117 | 0 | [374, 237] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000043__370.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 2795 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_220035__337 | 0 | | 0.0 | 14.4288 | 0 | [374, 51] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220035__337.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2796 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_234105__321 | 0 | | 0.0 | 20.1889 | 0 | [361, 470] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231213_234105__321.json | 50.0 | missing | missing | missing | |
| 2797 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_235741__837 | 0 | | 0.0 | 36.5412 | 0 | [372, 218] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_235741__837.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 2798 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_235850__987 | 0 | | 0.0 | 69.642 | 0 | [372, 462] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_235850__987.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 2799 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_220021__784 | 1 | | 0.0 | 53.8136 | 1 | [372, 348] | 0.10.0-DEV | 2 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_220021__784.json | 67.5 | missing | {"num_gpu": 99} | missing | |
| 2800 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231214_000100__811 | 0 | | 0.0 | 26.235 | 0 | [138, 721] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231214_000100__811.json | 0.0 | missing | missing | missing | |
| 2801 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | AsIs | 1SHOT | true | true | 5 | 20231225_010756__549 | 0 | | 0.0 | 22.5883 | 0 | [160, 397] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_010756__549.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2802 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_010821__141 | 0 | | 0.0 | 24.4656 | 0 | [160, 431] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_010821__141.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2803 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231214_000034__401 | 0 | | 0.0 | 24.5438 | 0 | [155, 675] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_000034__401.json | 0.0 | missing | missing | missing | |
| 2804 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_010714__313 | 0 | | 0.0 | 23.4561 | 0 | [163, 408] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_010714__313.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2805 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_010733__632 | 0 | | 0.0 | 19.4932 | 0 | [163, 336] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_010733__632.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2806 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231226_223025__594 | 4 | | 0.0 | 27.1754 | 4 | [163, 477] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_223025__594.json | 90.0 | missing | {"num_gpu": 99} | missing | |
| 2807 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_000009__479 | 0 | | 0.0 | 15.5025 | 0 | [184, 420] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_000009__479.json | 0.0 | missing | missing | missing | |
| 2808 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_010632__632 | 2 | | 0.0 | 19.0882 | 3 | [201, 323] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_010632__632.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 2809 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_010650__127 | 2 | | 0.0 | 18.1504 | 3 | [201, 306] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_010650__127.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 2810 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_222958__800 | 2 | | 0.0 | 19.7778 | 4 | [201, 337] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_222958__800.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 2811 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_235953__951 | 0 | | 0.0 | 13.0769 | 0 | [280, 315] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231213_235953__951.json | 25.0 | missing | missing | missing | |
| 2812 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_010554__101 | 2 | | 0.0 | 37.7664 | 3 | [298, 458] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_010554__101.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 2813 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_010613__606 | 2 | | 0.0 | 18.4311 | 3 | [298, 292] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_010613__606.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 2814 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_222938__203 | 0 | | 0.0 | 24.8798 | 0 | [298, 230] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_222938__203.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2815 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_000141__760 | 0 | | 0.0 | 25.9064 | 0 | [11, 673] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_000141__760.json | 0.0 | missing | missing | missing | |
| 2816 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_010926__885 | 1 | | 0.0 | 25.6768 | 4 | [466, 388] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_010926__885.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 2817 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_010955__455 | 0 | | 0.0 | 28.7018 | 0 | [466, 440] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_010955__455.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2818 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_223123__495 | 0 | | 0.0 | 31.0197 | 0 | [466, 481] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_223123__495.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2819 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_000115__902 | 0 | | 0.0 | 14.9184 | 0 | [455, 288] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_000115__902.json | 0.0 | missing | missing | missing | |
| 2820 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_010836__796 | 0 | | 0.0 | 15.2939 | 0 | [463, 207] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_010836__796.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2821 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_010901__177 | 2 | | 0.0 | 24.6206 | 5 | [463, 370] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_010901__177.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 2822 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_223052__451 | 2 | | 0.0 | 26.4461 | 4 | [463, 403] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_223052__451.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 2823 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230552__615 | 4 | | 0.0 | 3.18639 | 5 | [0, 240] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230552__615.json | 95.0 | missing | {"num_gpu": 99} | missing | |
| 2824 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230555__609 | 0 | | 0.0 | 2.91533 | 0 | [0, 220] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230555__609.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2825 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230600__931 | 0 | | 0.0 | 4.11288 | 3 | [0, 309] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230600__931.json | 65.0 | missing | {"num_gpu": 99} | missing | |
| 2826 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230604__682 | 0 | | 0.0 | 3.98032 | 0 | [0, 299] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230604__682.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2827 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230608__429 | 4 | | 0.0 | 4.15092 | 5 | [0, 310] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230608__429.json | 95.0 | missing | {"num_gpu": 99} | missing | |
| 2828 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230515__526 | 2 | | 0.0 | 1.96206 | 4 | [0, 148] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230515__526.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 2829 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230518__955 | 0 | | 0.0 | 3.08279 | 0 | [0, 232] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230518__955.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2830 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230523__133 | 2 | | 0.0 | 4.01553 | 3 | [0, 301] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230523__133.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 2831 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230525__902 | 0 | | 0.0 | 2.09635 | 0 | [0, 158] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230525__902.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2832 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230527__292 | 2 | | 0.0 | 2.22809 | 4 | [0, 168] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230527__292.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 2833 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230438__506 | 2 | | 0.0 | 4.6839 | 3 | [0, 345] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230438__506.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 2834 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230442__548 | 1 | | 0.0 | 4.48051 | 1 | [0, 327] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230442__548.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 2835 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230447__250 | 1 | | 0.0 | 4.25864 | 1 | [0, 312] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230447__250.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 2836 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230451__164 | 2 | | 0.0 | 3.65539 | 3 | [0, 268] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230451__164.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 2837 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230456__272 | 2 | | 0.0 | 5.39096 | 2 | [0, 393] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230456__272.json | 70.0 | missing | {"num_gpu": 99} | missing | |
| 2838 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_230732__786 | 0 | | 0.0 | 6.07261 | 0 | [0, 437] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230732__786.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2839 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_230738__965 | 0 | | 0.0 | 6.22345 | 0 | [0, 442] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230738__965.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2840 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_230743__829 | 3 | | 0.0 | 5.40613 | 4 | [0, 390] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230743__829.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 2841 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_230749__608 | 0 | | 0.0 | 4.99931 | 0 | [0, 364] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230749__608.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2842 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_230754__664 | 3 | | 0.0 | 5.26743 | 5 | [0, 383] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230754__664.json | 90.0 | missing | {"num_gpu": 99} | missing | |
| 2843 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230632__392 | 0 | | 0.0 | 3.28908 | 0 | [0, 240] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230632__392.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2844 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230637__946 | 0 | | 0.0 | 4.84725 | 0 | [0, 353] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230637__946.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2845 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230642__779 | 0 | | 0.0 | 5.10022 | 0 | [0, 371] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230642__779.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2846 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230647__210 | 2 | | 0.0 | 4.44351 | 4 | [0, 324] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230647__210.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 2847 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230653__883 | 0 | | 0.0 | 6.59921 | 0 | [0, 476] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230653__883.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2848 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231214_000306__968 | 0 | | 0.0 | 17.4978 | 0 | [138, 489] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__AsIs__1SHOT__20231214_000306__968.json | 0.0 | missing | missing | missing | |
| 2849 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_011141__370 | 0 | | 0.0 | 21.6953 | 0 | [134, 383] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_011141__370.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2850 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_011150__190 | 0 | | 0.0 | 9.18818 | 0 | [134, 151] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_011150__190.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2851 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | InJulia | 1SHOT | true | true | 5 | 20231214_000248__517 | 0 | | 0.0 | 25.2563 | 0 | [155, 693] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_000248__517.json | 50.0 | missing | missing | missing | |
| 2852 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_011051__687 | 0 | | 0.0 | 8.50558 | 0 | [137, 138] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_011051__687.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2853 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_011119__710 | 0 | | 0.0 | 28.5189 | 0 | [137, 506] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_011119__710.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2854 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_000223__421 | 0 | | 0.0 | 19.2196 | 0 | [184, 521] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_000223__421.json | 0.0 | missing | missing | missing | |
| 2855 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_011028__911 | 0 | | 0.0 | 9.83538 | 0 | [138, 163] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_011028__911.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2856 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_011042__308 | 0 | | 0.0 | 14.3757 | 0 | [138, 248] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_011042__308.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 2857 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_000204__552 | 0 | | 0.0 | 23.0983 | 0 | [280, 582] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_000204__552.json | 50.0 | missing | missing | missing | |
| 2858 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_011012__587 | 0 | | 0.0 | 16.8307 | 0 | [173, 97] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011012__587.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2859 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_011018__225 | 0 | | 0.0 | 5.9643 | 0 | [173, 84] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011018__225.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 2860 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_000403__460 | 0 | 0.0 | 27.5707 | 5 | [11, 712] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_000403__460.json | 75.0 | missing | missing | missing | |
| 2861 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_011222__614 | 0 | 0.0 | 13.7206 | 0 | [155, 235] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011222__614.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2862 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_011224__150 | 0 | 0.0 | 1.83038 | 0 | [155, 10] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011224__150.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2863 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_000336__922 | 0 | 0.0 | 29.6447 | 5 | [455, 660] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_000336__922.json | 75.0 | missing | missing | missing | |
| 2864 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_011157__310 | 0 | 0.0 | 6.32042 | 0 | [152, 96] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_011157__310.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2865 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_011209__851 | 0 | 0.0 | 11.995 | 0 | [152, 203] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_011209__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2866 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_231551__786 | 0 | 0.0 | 10.3904 | 0 | [0, 374] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_231551__786.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2867 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_231605__514 | 0 | 0.0 | 13.15 | 0 | [0, 469] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_231605__514.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2868 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_231615__533 | 5 | 0.0 | 9.84461 | 5 | [0, 352] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_231615__533.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2869 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_231632__206 | 1 | 0.0 | 17.0004 | 4 | [0, 606] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_231632__206.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2870 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_231643__666 | 4 | 0.0 | 11.3723 | 4 | [0, 408] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_231643__666.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2871 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_231357__194 | 0 | 0.0 | 14.0995 | 0 | [0, 502] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_231357__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2872 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_231409__777 | 2 | 0.0 | 11.3993 | 3 | [0, 407] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_231409__777.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2873 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_231418__824 | 0 | 0.0 | 9.37018 | 0 | [0, 336] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_231418__824.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2874 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_231424__190 | 2 | 0.0 | 5.33865 | 4 | [0, 192] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_231424__190.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2875 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240131_231435__862 | 0 | 0.0 | 11.2236 | 0 | [0, 402] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_231435__862.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2876 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_231208__778 | 0 | 0.0 | 6.27745 | 0 | [0, 225] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_231208__778.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2877 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_231220__189 | 3 | 0.0 | 12.3291 | 5 | [0, 438] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_231220__189.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2878 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_231230__349 | 2 | 0.0 | 9.40834 | 4 | [0, 332] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_231230__349.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2879 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_231245__299 | 0 | 0.0 | 14.8485 | 0 | [0, 522] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_231245__299.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2880 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_231253__164 | 4 | 0.0 | 7.68956 | 4 | [0, 273] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_231253__164.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2881 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_231912__718 | 0 | 0.0 | 17.5452 | 0 | [0, 621] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231912__718.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2882 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_231938__245 | 3 | 0.0 | 26.0164 | 5 | [0, 920] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231938__245.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2883 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_231943__399 | 4 | 0.0 | 5.12687 | 4 | [0, 182] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231943__399.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2884 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_231944__143 | 0 | 0.0 | 0.143066 | 0 | [0, 5] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231944__143.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2885 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_231944__641 | 0 | 0.0 | 0.143302 | 0 | [0, 5] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231944__641.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2886 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_231735__985 | 4 | 0.0 | 6.12449 | 4 | [0, 217] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_231735__985.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2887 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_231742__166 | 0 | 0.0 | 6.25316 | 0 | [0, 222] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_231742__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2888 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_231759__663 | 0 | 0.0 | 17.3133 | 0 | [0, 612] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_231759__663.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2889 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_231808__273 | 5 | 0.0 | 8.37434 | 5 | [0, 295] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_231808__273.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2890 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_231815__361 | 3 | 0.0 | 6.94533 | 4 | [0, 246] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_231815__361.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2891 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240131_225554__676 | 0 | 0.0 | 23.0479 | 5 | [0, 560] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_225554__676.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2892 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240131_225559__107 | 0 | 0.0 | 5.17599 | 0 | [0, 127] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_225559__107.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2893 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240131_225618__763 | 0 | 0.0 | 18.7034 | 1 | [0, 456] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_225618__763.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2894 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240131_225639__260 | 3 | 0.0 | 21.079 | 5 | [0, 513] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_225639__260.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2895 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240131_225708__622 | 0 | 0.0 | 29.1209 | 0 | [0, 706] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_225708__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2896 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_225402__147 | 0 | 0.0 | 22.0597 | 0 | [0, 535] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_225402__147.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2897 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_225408__686 | 0 | 0.0 | 5.86737 | 0 | [0, 143] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_225408__686.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2898 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_225417__540 | 0 | 0.0 | 9.20872 | 0 | [0, 225] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_225417__540.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2899 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_225426__562 | 0 | 0.0 | 8.79314 | 0 | [0, 215] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_225426__562.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2900 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_225441__112 | 0 | 0.0 | 15.3647 | 0 | [0, 375] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_225441__112.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2901 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_225100__925 | 2 | 0.0 | 23.4321 | 3 | [0, 567] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_225100__925.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2902 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_225129__445 | 4 | 0.0 | 28.9535 | 5 | [0, 700] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_225129__445.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2903 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_225139__634 | 0 | 0.0 | 10.0527 | 0 | [0, 245] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_225139__634.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2904 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_225146__156 | 0 | 0.0 | 7.42906 | 0 | [0, 181] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_225146__156.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2905 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_225201__286 | 4 | 0.0 | 14.5717 | 5 | [0, 352] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_225201__286.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2906 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_230218__436 | 1 | 0.0 | 26.3094 | 5 | [0, 632] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230218__436.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2907 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_230251__549 | 1 | 0.0 | 33.3292 | 4 | [0, 799] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230251__549.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2908 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_230259__320 | 0 | 0.0 | 8.04928 | 0 | [0, 193] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230259__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2909 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_230327__807 | 3 | 0.0 | 28.1238 | 5 | [0, 675] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230327__807.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2910 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_230406__840 | 3 | 0.0 | 38.6026 | 4 | [0, 925] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_230406__840.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2911 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_225913__671 | 2 | 0.0 | 22.5049 | 4 | [0, 540] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_225913__671.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2912 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_225926__758 | 4 | 0.0 | 12.5061 | 5 | [0, 301] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_225926__758.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2913 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_225942__524 | 0 | 0.0 | 15.5182 | 0 | [0, 373] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_225942__524.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2914 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_225943__767 | 0 | 0.0 | 0.789959 | 0 | [0, 19] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_225943__767.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2915 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230034__154 | 4 | 0.0 | 51.1587 | 5 | [0, 1225] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_230034__154.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2916 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_223533__624 | 0 | 0.0 | 5.36465 | 0 | [0, 100] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_223533__624.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2917 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_223548__196 | 0 | 0.0 | 14.7088 | 0 | [0, 275] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_223548__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2918 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_223614__691 | 4 | 0.0 | 26.4537 | 5 | [0, 492] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_223614__691.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2919 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_223627__677 | 0 | 0.0 | 12.0321 | 0 | [0, 224] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_223627__677.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2920 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_223649__365 | 5 | 0.0 | 22.7098 | 5 | [0, 421] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_223649__365.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2921 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_223244__543 | 0 | 0.0 | 10.5624 | 0 | [0, 197] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_223244__543.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2922 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_223306__220 | 4 | 0.0 | 21.9329 | 5 | [0, 408] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_223306__220.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2923 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_223318__865 | 0 | 0.0 | 12.101 | 0 | [0, 225] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_223318__865.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2924 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_223331__255 | 0 | 0.0 | 12.7875 | 4 | [0, 238] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_223331__255.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2925 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_223402__774 | 0 | 0.0 | 30.4245 | 0 | [0, 564] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_223402__774.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2926 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_222941__219 | 0 | 0.0 | 31.1492 | 0 | [0, 574] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_222941__219.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2927 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_223003__980 | 4 | 0.0 | 22.8582 | 5 | [0, 422] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_223003__980.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2928 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_223028__794 | 4 | 0.0 | 23.9458 | 5 | [0, 442] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_223028__794.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2929 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_223048__528 | 0 | 0.0 | 18.4939 | 0 | [0, 342] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_223048__528.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2930 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_223106__247 | 0 | 0.0 | 18.0844 | 0 | [0, 335] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_223106__247.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2931 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_224600__732 | 0 | 0.0 | 25.4457 | 0 | [0, 467] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_224600__732.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2932 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_224621__476 | 0 | 0.0 | 20.7296 | 0 | [0, 381] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_224621__476.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2933 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_224639__123 | 0 | 0.0 | 17.9907 | 0 | [0, 330] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_224639__123.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2934 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_224830__973 | 0 | 0.0 | 111.465 | 0 | [0, 2027] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_224830__973.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2935 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_224852__662 | 0 | 0.0 | 21.6583 | 0 | [471, 389] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_224852__662.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2936 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_223932__654 | 1 | 0.0 | 23.7269 | 1 | [0, 435] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_223932__654.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2937 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20240131_224041__971 | 0 | 0.0 | 69.2311 | 0 | [0, 1264] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_224041__971.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2938 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_224106__345 | 0 | 0.0 | 24.983 | 0 | [0, 455] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_224106__345.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2939 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240131_224153__145 | 0 | 0.0 | 46.922 | 0 | [0, 855] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_224153__145.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2940 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_224222__183 | 3 | 0.0 | 28.698 | 5 | [0, 527] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_224222__183.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2941 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230911__937 | 2 | 0.0 | 3.15061 | 5 | [0, 379] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230911__937.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2942 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230914__734 | 1 | 0.0 | 2.20575 | 1 | [0, 266] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230914__734.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2943 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230917__856 | 3 | 0.0 | 2.93849 | 5 | [0, 354] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230917__856.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2944 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230919__417 | 2 | 0.0 | 2.55179 | 5 | [0, 308] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230919__417.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2945 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_230922__622 | 2 | 0.0 | 2.06242 | 5 | [0, 248] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_230922__622.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2946 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230841__689 | 2 | 0.0 | 1.44135 | 3 | [0, 166] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230841__689.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2947 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230843__291 | 3 | 0.0 | 2.34455 | 5 | [0, 274] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230843__291.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2948 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230847__857 | 3 | 0.0 | 3.23261 | 5 | [0, 380] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230847__857.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2949 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230849__441 | 3 | 0.0 | 1.75595 | 5 | [0, 212] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230849__441.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2950 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_230851__266 | 1 | 0.0 | 2.36682 | 1 | [0, 285] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_230851__266.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2951 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230817__582 | 2 | 0.0 | 3.48713 | 5 | [0, 414] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230817__582.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2952 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230819__805 | 2 | 0.0 | 1.75025 | 5 | [0, 210] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230819__805.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2953 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230822__506 | 2 | 0.0 | 2.83312 | 5 | [0, 337] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230822__506.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2954 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230824__759 | 1 | 0.0 | 1.78597 | 1 | [0, 214] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230824__759.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2955 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_230826__870 | 3 | 0.0 | 2.43914 | 5 | [0, 291] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_230826__870.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2956 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_231020__897 | 0 | 0.0 | 7.74519 | 0 | [0, 885] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231020__897.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2957 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_231022__863 | 2 | 0.0 | 2.69813 | 5 | [0, 316] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231022__863.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2958 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_231028__978 | 4 | 0.0 | 5.4094 | 5 | [0, 627] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231028__978.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2959 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_231031__620 | 1 | 0.0 | 2.66487 | 1 | [0, 310] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231031__620.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2960 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240131_231035__969 | 0 | 0.0 | 4.54487 | 0 | [0, 528] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_231035__969.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2961 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230938__849 | 1 | 0.0 | 2.08107 | 1 | [0, 233] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230938__849.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2962 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230943__242 | 0 | 0.0 | 4.9141 | 0 | [0, 559] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230943__242.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2963 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230946__704 | 4 | 0.0 | 3.3654 | 5 | [0, 387] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230946__704.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2964 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230951__861 | 2 | 0.0 | 4.13067 | 5 | [0, 470] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230951__861.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2965 | NVIDIA-RTX-4090-4x | weather_data_analyzer | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_230954__377 | 1 | 0.0 | 3.83225 | 3 | [0, 434] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_230954__377.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2966 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_221220__144 | 0 | 0.0 | 24.342 | 0 | [138, 674] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_221220__144.json | 0.0 | missing | missing | missing | |
| 2967 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_221246__371 | 0 | 0.0 | 25.5574 | 0 | [1, 731] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_221246__371.json | 0.0 | missing | missing | missing | |
| 2968 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_221306__293 | 0 | 0.0 | 20.0793 | 0 | [1, 588] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_221306__293.json | 0.0 | missing | missing | missing | |
| 2969 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | false | 5 | 20231225_014003__956 | 0 | 0.0 | 102.288 | 0 | [160, 604] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_014003__956.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2970 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231225_014050__783 | 4 | 0.0 | 46.4615 | 4 | [160, 267] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_014050__783.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2971 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_221134__941 | 1 | 0.0 | 28.3444 | 1 | [1, 801] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_221134__941.json | 60.0 | missing | missing | missing | |
| 2972 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_221156__764 | 0 | 0.0 | 22.1811 | 0 | [1, 643] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_221156__764.json | 0.0 | missing | missing | missing | |
| 2973 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_013715__311 | 2 | 0.0 | 76.708 | 3 | [163, 446] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_013715__311.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2974 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_013820__735 | 2 | 0.0 | 64.5219 | 3 | [163, 373] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_013820__735.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2975 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_224506__860 | 4 | 0.0 | 61.8166 | 5 | [163, 358] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_224506__860.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2976 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_221035__970 | 0 | 0.0 | 19.8304 | 0 | [1, 574] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_221035__970.json | 50.0 | missing | missing | missing | |
| 2977 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_221047__530 | 0 | 0.0 | 12.7103 | 4 | [1, 380] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_221047__530.json | 70.0 | missing | missing | missing | |
| 2978 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_013454__633 | 2 | 0.0 | 48.0312 | 5 | [204, 266] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_013454__633.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2979 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_013558__938 | 4 | 0.0 | 63.4225 | 5 | [204, 360] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_013558__938.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2980 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_224403__937 | 0 | 0.0 | 70.0414 | 3 | [204, 401] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_224403__937.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2981 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_220931__302 | 1 | 0.0 | 32.1639 | 1 | [1, 862] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220931__302.json | 60.0 | missing | missing | missing | |
| 2982 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_220959__913 | 1 | 0.0 | 27.8832 | 1 | [1, 760] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220959__913.json | 60.0 | missing | missing | missing | |
| 2983 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_013307__768 | 0 | 0.0 | 99.7698 | 5 | [299, 391] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_013307__768.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2984 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_013406__964 | 1 | 0.0 | 58.5001 | 1 | [299, 313] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_013406__964.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2985 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_224253__454 | 0 | 0.0 | 101.699 | 5 | [299, 434] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_224253__454.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2986 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_221525__241 | 0 | 0.0 | 24.0577 | 0 | [1, 635] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_221525__241.json | 0.0 | missing | missing | missing | |
| 2987 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_221601__257 | 0 | 0.0 | 36.3165 | 0 | [1, 919] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_221601__257.json | 50.0 | missing | missing | missing | |
| 2988 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_014431__546 | 0 | 0.0 | 64.5248 | 0 | [492, 314] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_014431__546.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2989 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_014547__605 | 0 | 0.0 | 75.0892 | 0 | [492, 376] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_014547__605.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2990 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_224808__426 | 4 | 0.0 | 79.4219 | 5 | [492, 403] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_224808__426.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2991 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_221412__643 | 0 | 0.0 | 11.2138 | 0 | [1, 311] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_221412__643.json | 0.0 | missing | missing | missing | |
| 2992 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_221436__323 | 0 | 0.0 | 24.8654 | 0 | [1, 655] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_221436__323.json | 25.0 | missing | missing | missing | |
| 2993 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_014210__198 | 0 | 0.0 | 78.9252 | 0 | [490, 394] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_014210__198.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2994 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_014326__611 | 3 | 0.0 | 75.4009 | 4 | [490, 378] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_014326__611.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2995 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_224648__510 | 0 | 0.0 | 101.847 | 4 | [490, 534] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_224648__510.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2996 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_225705__399 | 0 | 0.0 | 10.7213 | 0 | [156, 398] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_225705__399.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2997 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_104415__884 | 0 | 0.0 | 14.1265 | 0 | [156, 522] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_104415__884.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2998 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_104431__810 | 0 | 0.0 | 15.9238 | 0 | [156, 587] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_104431__810.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2999 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_104529__735 | 0 | 0.0 | 58.0708 | 0 | [156, 1904] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_104529__735.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3000 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_225654__505 | 0 | 0.0 | 10.0286 | 0 | [193, 363] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_225654__505.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3001 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_104348__706 | 0 | 0.0 | 8.95364 | 0 | [193, 323] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_104348__706.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3002 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_104356__348 | 0 | 0.0 | 7.53589 | 0 | [193, 269] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_104356__348.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3003 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_104401__485 | 0 | 0.0 | 5.2168 | 0 | [193, 181] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_104401__485.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3004 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_225644__403 | 0 | 0.0 | 10.2809 | 0 | [277, 230] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225644__403.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3005 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_104325__361 | 0 | 0.0 | 15.158 | 0 | [277, 406] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104325__361.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3006 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_104330__502 | 0 | 0.0 | 4.85695 | 0 | [277, 155] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104330__502.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3007 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_104339__870 | 0 | 0.0 | 9.72986 | 0 | [277, 337] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104339__870.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3008 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_225723__349 | 0 | 0.0 | 7.11506 | 0 | [445, 209] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225723__349.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3009 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_104605__761 | 0 | 0.0 | 5.71987 | 0 | [445, 158] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104605__761.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3010 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_104616__179 | 0 | 0.0 | 10.7518 | 0 | [445, 339] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104616__179.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3011 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_104625__908 | 0 | 0.0 | 8.86537 | 0 | [445, 272] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104625__908.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3012 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_225715__373 | 0 | 0.0 | 10.4532 | 0 | [442, 329] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_225715__373.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3013 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_104539__737 | 0 | 0.0 | 9.62227 | 0 | [442, 299] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_104539__737.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3014 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_104549__779 | 0 | 0.0 | 10.1183 | 0 | [442, 317] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_104549__779.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3015 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_104559__637 | 0 | 0.0 | 10.4792 | 0 | [442, 330] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_104559__637.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3016 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_105210__427 | 0 | 0.0 | 3.96487 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105210__427.json | 50.0 | missing | missing | missing | |
| 3017 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_105214__316 | 0 | 0.0 | 3.98841 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105214__316.json | 25.0 | missing | missing | missing | |
| 3018 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_105219__747 | 0 | 0.0 | 5.13439 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105219__747.json | 50.0 | missing | missing | missing | |
| 3019 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_105224__156 | 0 | 0.0 | 4.66699 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105224__156.json | 0.0 | missing | missing | missing | |
| 3020 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_105228__741 | 0 | 0.0 | 4.18483 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105228__741.json | 25.0 | missing | missing | missing | |
| 3021 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_105115__612 | 0 | 0.0 | 3.19787 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105115__612.json | 0.0 | missing | missing | missing | |
| 3022 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_105119__698 | 0 | 0.0 | 3.59171 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105119__698.json | 0.0 | missing | missing | missing | |
| 3023 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_105125__139 | 1 | 0.0 | 6.15168 | 1 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105125__139.json | 60.0 | missing | missing | missing | |
| 3024 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_105138__932 | 0 | 0.0 | 13.3247 | 1 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105138__932.json | 55.0 | missing | missing | missing | |
| 3025 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_105142__628 | 0 | 0.0 | 3.48513 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105142__628.json | 50.0 | missing | missing | missing | |
| 3026 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_105028__169 | 0 | 0.0 | 7.43075 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105028__169.json | 0.0 | missing | missing | missing | |
| 3027 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_105031__212 | 0 | 0.0 | 3.64383 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105031__212.json | 0.0 | missing | missing | missing | |
| 3028 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_105035__542 | 0 | 0.0 | 3.59797 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105035__542.json | 25.0 | missing | missing | missing | |
| 3029 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_105039__790 | 0 | 0.0 | 3.85034 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105039__790.json | 0.0 | missing | missing | missing | |
| 3030 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_105044__473 | 0 | 0.0 | 4.83222 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105044__473.json | 25.0 | missing | missing | missing | |
| 3031 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_105411__356 | 0 | 0.0 | 6.06489 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105411__356.json | 50.0 | missing | missing | missing | |
| 3032 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_105415__868 | 0 | 0.0 | 4.55838 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105415__868.json | 25.0 | missing | missing | missing | |
| 3033 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_105419__245 | 0 | 0.0 | 4.0467 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105419__245.json | 50.0 | missing | missing | missing | |
| 3034 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_105423__535 | 0 | 0.0 | 3.5017 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105423__535.json | 25.0 | missing | missing | missing | |
| 3035 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_105425__492 | 0 | 0.0 | 2.24155 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105425__492.json | 25.0 | missing | missing | missing | |
| 3036 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_105301__297 | 0 | 0.0 | 6.57767 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105301__297.json | 0.0 | missing | missing | missing | |
| 3037 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_105306__789 | 0 | 0.0 | 5.50501 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105306__789.json | 0.0 | missing | missing | missing | |
| 3038 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_105310__995 | 0 | 0.0 | 3.69862 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105310__995.json | 50.0 | missing | missing | missing | |
| 3039 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_105315__881 | 0 | 0.0 | 4.93675 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105315__881.json | 50.0 | missing | missing | missing | |
| 3040 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_105321__405 | 3 | 0.0 | 5.80281 | 4 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105321__405.json | 85.0 | missing | missing | missing | |
| 3041 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_221028__995 | 1 | 0.0 | 29.3538 | 1 | [0, 445] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_221028__995.json | 60.0 | missing | missing | missing | |
| 3042 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_221056__909 | 1 | 0.0 | 28.8338 | 1 | [0, 443] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_221056__909.json | 60.0 | missing | missing | missing | |
| 3043 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_221121__915 | 1 | 0.0 | 24.6589 | 1 | [0, 375] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_221121__915.json | 60.0 | missing | missing | missing | |
| 3044 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_221148__388 | 1 | 0.0 | 26.6965 | 1 | [0, 411] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_221148__388.json | 60.0 | missing | missing | missing | |
| 3045 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_221215__238 | 1 | 0.0 | 26.998 | 1 | [0, 414] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_221215__238.json | 60.0 | missing | missing | missing | |
| 3046 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_220607__845 | 1 | 0.0 | 23.7496 | 1 | [0, 362] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_220607__845.json | 60.0 | missing | missing | missing | |
| 3047 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_220631__973 | 0 | 0.0 | 23.2703 | 5 | [0, 357] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_220631__973.json | 75.0 | missing | missing | missing | |
| 3048 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_220652__175 | 1 | 0.0 | 21.844 | 1 | [0, 337] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_220652__175.json | 60.0 | missing | missing | missing | |
| 3049 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_220717__404 | 1 | 0.0 | 24.2252 | 1 | [0, 369] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_220717__404.json | 60.0 | missing | missing | missing | |
| 3050 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_220743__411 | 1 | 0.0 | 25.9663 | 1 | [0, 397] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_220743__411.json | 60.0 | missing | missing | missing | |
| 3051 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240223_220206__824 | 0 | 0.0 | 30.7032 | 1 | [0, 470] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_220206__824.json | 55.0 | missing | missing | missing | |
| 3052 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_220229__661 | 0 | 0.0 | 22.7756 | 0 | [0, 345] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_220229__661.json | 25.0 | missing | missing | missing | |
| 3053 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240223_220250__294 | 0 | 0.0 | 20.2389 | 1 | [0, 312] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_220250__294.json | 55.0 | missing | missing | missing | |
| 3054 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240223_220320__797 | 0 | 0.0 | 29.8616 | 1 | [0, 454] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_220320__797.json | 55.0 | missing | missing | missing | |
| 3055 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240223_220347__580 | 1 | 0.0 | 26.9769 | 1 | [0, 416] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_220347__580.json | 60.0 | missing | missing | missing | |
| 3056 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_222012__379 | 1 | 0.0 | 27.5441 | 1 | [0, 413] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_222012__379.json | 60.0 | missing | missing | missing | |
| 3057 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_222042__166 | 1 | 0.0 | 29.7552 | 1 | [0, 443] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_222042__166.json | 60.0 | missing | missing | missing | |
| 3058 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240223_222109__605 | 0 | 0.0 | 27.4885 | 0 | [0, 415] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_222109__605.json | 0.0 | missing | missing | missing | |
| 3059 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_222135__757 | 1 | 0.0 | 25.3785 | 1 | [0, 380] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_222135__757.json | 60.0 | missing | missing | missing | |
| 3060 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_222205__860 | 1 | 0.0 | 30.337 | 1 | [0, 455] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_222205__860.json | 60.0 | missing | missing | missing | |
| 3061 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_221523__386 | 0 | 0.0 | 30.55 | 0 | [0, 456] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_221523__386.json | 0.0 | missing | missing | missing | |
| 3062 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_221550__571 | 1 | 0.0 | 26.649 | 1 | [0, 398] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_221550__571.json | 60.0 | missing | missing | missing | |
| 3063 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_221622__850 | 1 | 0.0 | 32.1951 | 1 | [0, 486] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_221622__850.json | 60.0 | missing | missing | missing | |
| 3064 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_221651__918 | 1 | 0.0 | 29.1677 | 1 | [0, 441] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_221651__918.json | 60.0 | missing | missing | missing | |
| 3065 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_221719__378 | 0 | 0.0 | 27.3404 | 0 | [0, 406] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_221719__378.json | 0.0 | missing | missing | missing | |
| 3066 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231213_213558__457 | 0 | 0.000529 | 6.47626 | 1 | [140, 306] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_213558__457.json | 55.0 | missing | missing | missing | |
| 3067 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231225_184849__963 | 1 | 0.0005275 | 5.04754 | 1 | [140, 305] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_184849__963.json | 60.0 | missing | missing | missing | |
| 3068 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231225_184855__804 | 4 | 0.0006745 | 6.13615 | 5 | [140, 403] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_184855__804.json | 95.0 | missing | missing | missing | |
| 3069 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo--optim | AsIs | 1SHOT | true | true | 5 | 20231215_191510__718 | 4 | 0.0 | 7.93298 | 5 | [140, 303] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_191510__718.json | 95.0 | 0.5 | missing | 0.5 |
| 3070 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_213551__532 | 1 | 0.000493 | 6.09521 | 1 | [143, 281] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_213551__532.json | 60.0 | missing | missing | missing | |
| 3071 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_184835__779 | 4 | 0.000562 | 5.3744 | 5 | [143, 327] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_184835__779.json | 95.0 | missing | missing | missing | |
| 3072 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_184844__811 | 1 | 0.0007855 | 7.67043 | 1 | [143, 476] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_184844__811.json | 60.0 | missing | missing | missing | |
| 3073 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_192251__585 | 4 | 0.000544 | 5.40626 | 5 | [143, 315] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_192251__585.json | 95.0 | missing | missing | missing | |
| 3074 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_192259__790 | 1 | 0.0006865 | 7.24179 | 1 | [143, 410] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_192259__790.json | 60.0 | missing | missing | missing | |
| 3075 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_191502__956 | 1 | 0.0 | 6.74397 | 1 | [143, 321] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_191502__956.json | 60.0 | 0.5 | missing | 0.5 |
| 3076 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_213545__971 | 1 | 0.0004415 | 5.25032 | 1 | [178, 235] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_213545__971.json | 60.0 | missing | missing | missing | |
| 3077 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184826__473 | 1 | 0.000317 | 2.97399 | 1 | [178, 152] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_184826__473.json | 60.0 | missing | missing | missing | |
| 3078 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184830__583 | 1 | 0.0004085 | 3.87583 | 1 | [178, 213] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_184830__583.json | 60.0 | missing | missing | missing | |
| 3079 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_192243__783 | 4 | 0.0006185 | 6.17253 | 5 | [178, 353] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_192243__783.json | 95.0 | missing | missing | missing | |
| 3080 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_192246__167 | 0 | 0.000296 | 3.01475 | 0 | [178, 138] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_192246__167.json | 50.0 | missing | missing | missing | |
| 3081 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_191456__698 | 1 | 0.0 | 3.09184 | 1 | [178, 132] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_191456__698.json | 60.0 | 0.5 | missing | 0.5 |
| 3082 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_213540__857 | 0 | 0.000459 | 5.04955 | 0 | [255, 221] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_213540__857.json | 0.0 | missing | missing | missing | |
| 3083 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184819__681 | 3 | 0.0003765 | 3.02957 | 4 | [255, 166] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184819__681.json | 85.0 | missing | missing | missing | |
| 3084 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_184823__724 | 0 | 0.0003885 | 3.24633 | 0 | [255, 174] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184823__724.json | 0.0 | missing | missing | missing | |
| 3085 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_192232__128 | 5 | 0.0006645 | 6.62265 | 5 | [255, 358] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_192232__128.json | 100.0 | missing | missing | missing | |
| 3086 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_192237__372 | 0 | 0.000402 | 3.71152 | 0 | [255, 183] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_192237__372.json | 0.0 | missing | missing | missing | |
| 3087 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_191452__489 | 0 | 0.0 | 4.35939 | 0 | [255, 183] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_191452__489.json | 0.0 | 0.5 | missing | 0.5 |
| 3088 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_213608__132 | 0 | 0.0005475 | 5.22646 | 0 | [402, 231] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_213608__132.json | 0.0 | missing | missing | missing | |
| 3089 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_184906__218 | 0 | 0.000495 | 3.94145 | 0 | [402, 196] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184906__218.json | 0.0 | missing | missing | missing | |
| 3090 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_184910__958 | 0 | 0.0004635 | 3.8869 | 0 | [402, 175] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184910__958.json | 0.0 | missing | missing | missing | |
| 3091 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_192310__244 | 0 | 0.000222 | 0.511897 | 0 | [402, 14] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_192310__244.json | 0.0 | missing | missing | missing | |
| 3092 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_192310__469 | 0 | 0.000462 | 3.16037 | 0 | [402, 174] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_192310__469.json | 0.0 | missing | missing | missing | |
| 3093 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_191522__710 | 0 | 0.0 | 5.89355 | 0 | [402, 228] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_191522__710.json | 0.0 | 0.5 | missing | 0.5 |
| 3094 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_213602__154 | 0 | 0.000499 | 4.43217 | 0 | [401, 199] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_213602__154.json | 0.0 | missing | missing | missing | |
| 3095 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_184859__223 | 0 | 0.0004855 | 3.39168 | 0 | [401, 190] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_184859__223.json | 0.0 | missing | missing | missing | |
| 3096 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_184902__651 | 0 | 0.0005095 | 3.00381 | 0 | [401, 206] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_184902__651.json | 0.0 | missing | missing | missing | |
| 3097 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_192303__436 | 0 | 0.000448 | 3.47566 | 0 | [401, 165] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_192303__436.json | 0.0 | missing | missing | missing | |
| 3098 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_192307__882 | 0 | 0.0004885 | 3.57909 | 0 | [401, 192] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_192307__882.json | 0.0 | missing | missing | missing | |
| 3099 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_191516__405 | 0 | 0.0 | 5.11553 | 0 | [401, 203] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_191516__405.json | 0.0 | 0.5 | missing | 0.5 |
| 3100 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231213_213620__715 | 4 | 0.00057 | 3.97989 | 5 | [140, 215] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_213620__715.json | 95.0 | missing | missing | missing | |
| 3101 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231225_184928__354 | 4 | 0.000608 | 2.77812 | 5 | [140, 234] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_184928__354.json | 95.0 | missing | missing | missing | |
| 3102 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231225_184931__954 | 2 | 0.00055 | 2.78851 | 5 | [140, 205] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_184931__954.json | 85.0 | missing | missing | missing | |
| 3103 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | true | true | 5 | 20231215_191540__305 | 4 | 0.0 | 4.84934 | 5 | [140, 199] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_191540__305.json | 95.0 | 0.9 | missing | 0.1 |
| 3104 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231213_213616__608 | 4 | 0.000505 | 2.76754 | 5 | [143, 181] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_213616__608.json | 95.0 | missing | missing | missing | |
| 3105 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_184923__648 | 4 | 0.000601 | 3.02977 | 5 | [143, 229] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_184923__648.json | 95.0 | missing | missing | missing | |
| 3106 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_184926__347 | 4 | 0.000519 | 2.23239 | 5 | [143, 188] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_184926__347.json | 95.0 | missing | missing | missing | |
| 3107 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_192325__182 | 4 | 0.000509 | 2.91123 | 5 | [143, 183] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_192325__182.json | 95.0 | missing | missing | missing | |
| 3108 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_192330__905 | 2 | 0.000665 | 4.3812 | 3 | [143, 261] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_192330__905.json | 75.0 | missing | missing | missing | |
| 3109 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_191535__889 | 4 | 0.0 | 5.93241 | 5 | [143, 258] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_191535__889.json | 95.0 | 0.9 | missing | 0.1 | |
| 3110 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_213613__263 | 0 | 0.00039 | 2.05028 | 5 | [178, 106] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_213613__263.json | 75.0 | missing | missing | missing | |
| 3111 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184918__147 | 1 | 0.000422 | 2.04349 | 1 | [178, 122] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_184918__147.json | 60.0 | missing | missing | missing | |
| 3112 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_184920__429 | 2 | 0.000462 | 2.21569 | 3 | [178, 142] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_184920__429.json | 75.0 | missing | missing | missing | |
| 3113 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_192320__336 | 4 | 0.000428 | 2.79188 | 5 | [178, 125] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_192320__336.json | 95.0 | missing | missing | missing | |
| 3114 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_192322__714 | 4 | 0.000424 | 2.49333 | 5 | [178, 123] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_192322__714.json | 95.0 | missing | missing | missing | |
| 3115 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_191529__964 | 4 | 0.0 | 2.88752 | 5 | [178, 125] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_191529__964.json | 95.0 | 0.9 | missing | 0.1 | |
| 3116 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_213611__460 | 4 | 0.000547 | 3.35136 | 5 | [255, 146] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_213611__460.json | 95.0 | missing | missing | missing | |
| 3117 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184912__492 | 4 | 0.000619 | 2.20912 | 5 | [255, 182] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184912__492.json | 95.0 | missing | missing | missing | |
| 3118 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184915__190 | 5 | 0.000649 | 2.70662 | 5 | [255, 197] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184915__190.json | 100.0 | missing | missing | missing | |
| 3119 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_192313__881 | 4 | 0.000601 | 3.24707 | 5 | [255, 173] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_192313__881.json | 95.0 | missing | missing | missing | |
| 3120 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_192317__907 | 4 | 0.000581 | 3.24372 | 5 | [255, 163] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_192317__907.json | 95.0 | missing | missing | missing | |
| 3121 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_191526__964 | 4 | 0.0 | 3.53639 | 5 | [255, 164] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_191526__964.json | 95.0 | 0.9 | missing | 0.1 | |
| 3122 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_213626__878 | 0 | 0.000694 | 2.68838 | 0 | [402, 146] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_213626__878.json | 0.0 | missing | missing | missing | |
| 3123 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_184936__276 | 0 | 0.000576 | 1.23326 | 0 | [402, 87] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184936__276.json | 0.0 | missing | missing | missing | |
| 3124 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_184938__713 | 0 | 0.000888 | 2.8239 | 0 | [402, 243] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_184938__713.json | 0.0 | missing | missing | missing | |
| 3125 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_192336__836 | 0 | 0.000586 | 1.98434 | 0 | [402, 92] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_192336__836.json | 0.0 | missing | missing | missing | |
| 3126 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_192338__258 | 0 | 0.00056 | 2.04999 | 0 | [402, 79] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_192338__258.json | 0.0 | missing | missing | missing | |
| 3127 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_191549__808 | 0 | 0.0 | 3.35596 | 0 | [402, 114] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_191549__808.json | 0.0 | 0.9 | missing | 0.1 | |
| 3128 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_213623__133 | 3 | 0.000673 | 2.51861 | 5 | [401, 136] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_213623__133.json | 90.0 | missing | missing | missing | |
| 3129 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_184933__195 | 0 | 0.000581 | 1.49756 | 0 | [401, 90] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_184933__195.json | 0.0 | missing | missing | missing | |
| 3130 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_184934__351 | 0 | 0.000557 | 1.34771 | 0 | [401, 78] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_184934__351.json | 0.0 | missing | missing | missing | |
| 3131 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_192332__279 | 0 | 0.000587 | 1.82587 | 0 | [401, 93] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_192332__279.json | 0.0 | missing | missing | missing | |
| 3132 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_192334__746 | 0 | 0.000683 | 2.1177 | 0 | [401, 141] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_192334__746.json | 50.0 | missing | missing | missing | |
| 3133 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_191546__510 | 3 | 0.0 | 5.65706 | 4 | [401, 144] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_191546__510.json | 85.0 | 0.9 | missing | 0.1 | |
| 3134 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231213_213759__811 | 5 | 0.01511 | 38.8744 | 5 | [140, 457] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_213759__811.json | 100.0 | missing | missing | missing | |
| 3135 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231225_185209__235 | 4 | 0.01403 | 50.6093 | 4 | [140, 421] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_185209__235.json | 90.0 | missing | missing | missing | |
| 3136 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231225_185226__994 | 5 | 0.01469 | 16.4029 | 5 | [140, 443] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_185226__994.json | 100.0 | missing | missing | missing | |
| 3137 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview--optim | AsIs | 1SHOT | true | true | 5 | 20231215_191730__894 | 5 | 0.0 | 39.766 | 5 | [140, 305] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_191730__894.json | 100.0 | 0.1 | missing | 0.9 | |
| 3138 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_213719__285 | 5 | 0.01229 | 28.8036 | 5 | [143, 362] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_213719__285.json | 100.0 | missing | missing | missing | |
| 3139 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_185100__578 | 5 | 0.01337 | 22.6689 | 5 | [143, 398] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_185100__578.json | 100.0 | missing | missing | missing | |
| 3140 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_185117__885 | 4 | 0.01616 | 17.0238 | 4 | [143, 491] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_185117__885.json | 90.0 | missing | missing | missing | |
| 3141 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_192537__202 | 5 | 0.01322 | 25.9268 | 5 | [143, 393] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_192537__202.json | 100.0 | missing | missing | missing | |
| 3142 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_192650__592 | 4 | 0.01787 | 72.6439 | 4 | [143, 548] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_192650__592.json | 90.0 | missing | missing | missing | |
| 3143 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_191649__872 | 5 | 0.0 | 34.2997 | 5 | [143, 374] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_191649__872.json | 100.0 | 0.1 | missing | 0.9 | |
| 3144 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_213649__829 | 5 | 0.00721 | 15.1358 | 5 | [178, 181] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_213649__829.json | 100.0 | missing | missing | missing | |
| 3145 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_185027__121 | 4 | 0.00883 | 9.53733 | 4 | [178, 235] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_185027__121.json | 90.0 | missing | missing | missing | |
| 3146 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_185037__869 | 5 | 0.00601 | 8.92441 | 5 | [178, 141] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_185037__869.json | 100.0 | missing | missing | missing | |
| 3147 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_192448__594 | 5 | 0.0082 | 11.1552 | 5 | [178, 214] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_192448__594.json | 100.0 | missing | missing | missing | |
| 3148 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_192510__854 | 5 | 0.00721 | 21.737 | 5 | [178, 181] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_192510__854.json | 100.0 | missing | missing | missing | |
| 3149 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_191614__925 | 5 | 0.0 | 10.5531 | 5 | [178, 139] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_191614__925.json | 100.0 | 0.1 | missing | 0.9 | |
| 3150 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_213634__621 | 0 | 0.00648 | 8.35109 | 0 | [255, 131] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_213634__621.json | 0.0 | missing | missing | missing | |
| 3151 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_184956__636 | 5 | 0.015 | 17.8134 | 5 | [255, 415] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_184956__636.json | 100.0 | missing | missing | missing | |
| 3152 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_185017__447 | 4 | 0.01401 | 20.893 | 5 | [255, 382] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_185017__447.json | 95.0 | missing | missing | missing | |
| 3153 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_192411__325 | 5 | 0.01218 | 32.5662 | 5 | [255, 321] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_192411__325.json | 100.0 | missing | missing | missing | |
| 3154 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_192437__410 | 4 | 0.01593 | 25.2508 | 5 | [255, 446] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_192437__410.json | 95.0 | missing | missing | missing | |
| 3155 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_191604__828 | 0 | 0.0 | 14.3881 | 0 | [255, 168] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_191604__828.json | 0.0 | 0.1 | missing | 0.9 | |
| 3156 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_213911__878 | 4 | 0.01731 | 30.106 | 5 | [402, 443] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_213911__878.json | 95.0 | missing | missing | missing | |
| 3157 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_185422__926 | 5 | 0.0159 | 16.9411 | 5 | [402, 396] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_185422__926.json | 100.0 | missing | missing | missing | |
| 3158 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_185445__262 | 0 | 0.01773 | 22.9867 | 0 | [402, 457] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_185445__262.json | 25.0 | missing | missing | missing | |
| 3159 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_192855__522 | 5 | 0.02235 | 64.4627 | 5 | [402, 611] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_192855__522.json | 100.0 | missing | missing | missing | |
| 3160 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_192914__491 | 5 | 0.01398 | 19.2989 | 5 | [402, 332] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_192914__491.json | 100.0 | missing | missing | missing | |
| 3161 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_191915__117 | 4 | 0.0 | 61.7725 | 4 | [402, 558] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_191915__117.json | 90.0 | 0.1 | missing | 0.9 | |
| 3162 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_213841__125 | 5 | 0.02108 | 41.1693 | 5 | [401, 569] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_213841__125.json | 100.0 | missing | missing | missing | |
| 3163 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_185335__730 | 4 | 0.01955 | 67.9655 | 5 | [401, 518] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_185335__730.json | 95.0 | missing | missing | missing | |
| 3164 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_185404__908 | 5 | 0.01817 | 28.5683 | 5 | [401, 472] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_185404__908.json | 100.0 | missing | missing | missing | |
| 3165 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_192734__438 | 0 | 0.02573 | 43.5459 | 0 | [401, 724] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_192734__438.json | 50.0 | missing | missing | missing | |
| 3166 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_192750__474 | 0 | 0.01034 | 16.1261 | 0 | [401, 211] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_192750__474.json | 0.0 | missing | missing | missing | |
| 3167 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_191813__461 | 5 | 0.0 | 42.013 | 5 | [401, 441] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_191813__461.json | 100.0 | 0.1 | missing | 0.9 | |
| 3168 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | AsIs | 1SHOT | false | false | 5 | 20231213_235210__796 | 0 | 0.0 | 19.5268 | 0 | [138, 544] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__AsIs__1SHOT__20231213_235210__796.json | 0.0 | missing | missing | missing | |
| 3169 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_004400__574 | 0 | 0.0 | 22.4375 | 0 | [138, 622] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__AsIs__1SHOT__20231225_004400__574.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3170 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_004422__298 | 0 | 0.0 | 21.4592 | 0 | [1, 623] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__AsIs__1SHOT__20231225_004422__298.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3171 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | InJulia | 1SHOT | true | true | 5 | 20231213_235150__112 | 0 | 0.0 | 19.786 | 0 | [155, 549] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__InJulia__1SHOT__20231213_235150__112.json | 50.0 | missing | missing | missing | |
| 3172 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_004306__666 | 0 | 0.0 | 21.8207 | 0 | [155, 604] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__InJulia__1SHOT__20231225_004306__666.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3173 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_004338__466 | 0 | 0.0 | 31.2168 | 0 | [1, 870] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__InJulia__1SHOT__20231225_004338__466.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3174 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | InJulia | 1SHOT | false | false | 5 | 20231226_222057__326 | 0 | 0.0 | 30.0519 | 0 | [155, 825] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__InJulia__1SHOT__20231226_222057__326.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3175 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_235131__651 | 0 | 0.0 | 16.2652 | 0 | [184, 442] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertAsk__1SHOT__20231213_235131__651.json | 0.0 | missing | missing | missing | |
| 3176 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_004227__396 | 1 | 0.0 | 21.3075 | 1 | [184, 576] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_004227__396.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3177 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_004245__378 | 0 | 0.0 | 17.4589 | 0 | [1, 509] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_004245__378.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3178 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_222027__999 | 1 | 0.0 | 20.0148 | 5 | [184, 550] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_222027__999.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3179 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_235114__377 | 0 | 0.0 | 30.6795 | 0 | [280, 772] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_235114__377.json | 50.0 | missing | missing | missing | |
| 3180 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_004134__852 | 0 | 0.0 | 43.374 | 0 | [298, 945] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004134__852.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3181 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004205__409 | 0 | 0.0 | 31.0204 | 5 | [1, 834] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004205__409.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3182 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_222006__796 | 0 | 0.0 | 31.5484 | 0 | [298, 679] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_222006__796.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3183 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_235252__665 | 0 | 0.0 | 14.2854 | 0 | [11, 384] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_235252__665.json | 0.0 | missing | missing | missing | |
| 3184 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_004549__744 | 0 | 0.0 | 22.7765 | 0 | [11, 597] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004549__744.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3185 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_004600__952 | 0 | 0.0 | 10.5604 | 0 | [1, 293] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004600__952.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3186 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_222147__482 | 0 | 0.0 | 19.1878 | 4 | [11, 515] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_222147__482.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3187 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_235238__344 | 0 | 0.0 | 27.6486 | 0 | [455, 612] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapTask__1SHOT__20231213_235238__344.json | 50.0 | missing | missing | missing | |
| 3188 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_004453__964 | 0 | 0.0 | 31.7742 | 0 | [455, 710] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_004453__964.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3189 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_004526__961 | 0 | 0.0 | 32.5024 | 5 | [1, 832] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_004526__961.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3190 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_222128__364 | 0 | 0.0 | 30.859 | 0 | [455, 697] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_222128__364.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3191 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | AsIs | 1SHOT | false | false | 5 | 20231214_000522__606 | 0 | 0.0 | 20.2319 | 0 | [138, 564] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__AsIs__1SHOT__20231214_000522__606.json | 0.0 | missing | missing | missing | |
| 3192 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_011349__642 | 0 | 0.0 | 11.0086 | 0 | [152, 346] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__AsIs__1SHOT__20231225_011349__642.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3193 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | AsIs | 1SHOT | true | false | 5 | 20231225_011358__269 | 0 | 0.0 | 8.90593 | 0 | [152, 278] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__AsIs__1SHOT__20231225_011358__269.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3194 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | InJulia | 1SHOT | true | true | 5 | 20231214_000501__874 | 0 | 0.0 | 22.0551 | 0 | [155, 610] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__InJulia__1SHOT__20231214_000501__874.json | 50.0 | missing | missing | missing | |
| 3195 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_011327__487 | 1 | 0.0 | 10.2432 | 5 | [155, 322] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__InJulia__1SHOT__20231225_011327__487.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3196 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_011338__600 | 3 | 0.0 | 11.1629 | 5 | [155, 352] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__InJulia__1SHOT__20231225_011338__600.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3197 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | InJulia | 1SHOT | true | true | 5 | 20231226_223209__713 | 1 | 0.0 | 13.97 | 5 | [155, 443] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__InJulia__1SHOT__20231226_223209__713.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3198 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_000439__595 | 0 | 0.0 | 15.7073 | 0 | [184, 426] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_000439__595.json | 50.0 | missing | missing | missing | |
| 3199 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_011308__125 | 0 | 0.0 | 11.6662 | 0 | [194, 358] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_011308__125.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3200 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_011317__260 | 0 | 0.0 | 8.34874 | 0 | [194, 249] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_011317__260.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3201 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_223155__713 | 3 | 0.0 | 12.9277 | 5 | [194, 399] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_223155__713.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3202 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_000423__706 | 0 | 0.0 | 19.893 | 0 | [280, 499] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_000423__706.json | 0.0 | missing | missing | missing | |
| 3203 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_011243__119 | 0 | 0.0 | 18.5083 | 5 | [290, 366] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011243__119.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3204 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_011256__748 | 3 | 0.0 | 13.202 | 5 | [290, 387] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011256__748.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3205 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_223142__499 | 0 | 0.0 | 18.6142 | 0 | [290, 377] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_223142__499.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3206 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_000617__576 | 0 | 0.0 | 16.8756 | 0 | [11, 450] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_000617__576.json | 50.0 | missing | missing | missing | |
| 3207 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_011437__581 | 3 | 0.0 | 13.553 | 5 | [458, 366] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011437__581.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3208 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_011450__105 | 3 | 0.0 | 12.6229 | 5 | [458, 338] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011450__105.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3209 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_223237__834 | 1 | 0.0 | 16.3832 | 5 | [458, 453] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_223237__834.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3210 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_000600__234 | 0 | 0.0 | 38.225 | 0 | [455, 859] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_000600__234.json | 50.0 | missing | missing | missing | |
| 3211 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_011411__736 | 3 | 0.0 | 12.3947 | 5 | [455, 331] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_011411__736.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3212 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_011424__237 | 0 | 0.0 | 12.8764 | 0 | [455, 345] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_011424__237.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3213 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_223220__349 | 3 | 0.0 | 10.8898 | 5 | [455, 284] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_223220__349.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3214 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175508__517 | 4 | 0.0 | 17.7621 | 5 | [155, 336] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175508__517.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3215 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175526__836 | 3 | 0.0 | 17.3546 | 5 | [155, 328] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175526__836.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3216 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175545__220 | 4 | 0.0 | 19.0476 | 5 | [155, 361] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175545__220.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3217 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175409__234 | 3 | 0.0 | 14.9109 | 5 | [194, 273] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175409__234.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3218 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_175431__218 | 0 | 0.0 | 21.5599 | 0 | [194, 402] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175431__218.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3219 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175450__899 | 3 | 0.0 | 19.521 | 5 | [194, 363] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175450__899.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3220 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_175308__703 | 0 | 0.0 | 23.2991 | 5 | [290, 422] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175308__703.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3221 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_175333__489 | 0 | 0.0 | 24.9473 | 5 | [290, 453] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175333__489.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3222 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_175354__767 | 0 | 0.0 | 20.9069 | 5 | [290, 376] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175354__767.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3223 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175713__593 | 3 | 0.0 | 24.2353 | 5 | [458, 416] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175713__593.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3224 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175726__496 | 3 | 0.0 | 12.3569 | 5 | [458, 191] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175726__496.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3225 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175737__918 | 3 | 0.0 | 11.1968 | 5 | [458, 169] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175737__918.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3226 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175608__761 | 3 | 0.0 | 22.6319 | 5 | [455, 386] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175608__761.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3227 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175628__751 | 3 | 0.0 | 19.9783 | 5 | [455, 336] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175628__751.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3228 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175649__155 | 4 | 0.0 | 21.2001 | 5 | [455, 359] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175649__155.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3229 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231213_214419__317 | 3 | 0.00398887 | 41.9486 | 5 | [150, 443] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__AsIs__1SHOT__20231213_214419__317.json | 90.0 | missing | missing | missing | |
| 3230 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231225_185845__800 | 4 | 0.00345493 | 8.47548 | 5 | [150, 377] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__AsIs__1SHOT__20231225_185845__800.json | 95.0 | missing | missing | missing | |
| 3231 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231225_185900__858 | 4 | 0.00583339 | 15.116 | 5 | [150, 671] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__AsIs__1SHOT__20231225_185900__858.json | 95.0 | missing | missing | missing | |
| 3232 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium--optim | AsIs | 1SHOT | true | true | 5 | 20231215_192135__917 | 3 | 0.0 | 7.25179 | 5 | [150, 327] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__AsIs__1SHOT__20231215_192135__917.json | 90.0 | 0.9 | missing | 0.3 | |
| 3233 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231213_214337__355 | 5 | 0.00359247 | 40.8551 | 5 | [153, 393] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__InJulia__1SHOT__20231213_214337__355.json | 100.0 | missing | missing | missing | |
| 3234 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_185826__525 | 4 | 0.00382708 | 18.7139 | 5 | [153, 422] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__InJulia__1SHOT__20231225_185826__525.json | 95.0 | missing | missing | missing | |
| 3235 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_185837__867 | 4 | 0.00423967 | 10.5539 | 5 | [153, 473] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__InJulia__1SHOT__20231225_185837__867.json | 95.0 | missing | missing | missing | |
| 3236 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_193258__308 | 4 | 0.00428821 | 21.9798 | 5 | [153, 479] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__InJulia__1SHOT__20231227_193258__308.json | 95.0 | missing | missing | missing | |
| 3237 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_193320__490 | 2 | 0.00440956 | 21.9899 | 5 | [153, 494] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__InJulia__1SHOT__20231227_193320__490.json | 85.0 | missing | missing | missing | |
| 3238 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium--optim | InJulia | 1SHOT | true | true | 5 | 20231215_192128__965 | 3 | 0.0 | 9.06894 | 5 | [153, 409] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__InJulia__1SHOT__20231215_192128__965.json | 90.0 | 0.9 | missing | 0.3 |
| 3239 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_214256__132 | 3 | 0.00289686 | 42.3103 | 5 | [192, 294] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_214256__132.json | 90.0 | missing | missing | missing | |
| 3240 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_185746__282 | 4 | 0.00272697 | 15.2435 | 5 | [192, 273] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_185746__282.json | 95.0 | missing | missing | missing | |
| 3241 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_185807__874 | 3 | 0.00315574 | 20.322 | 5 | [192, 326] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_185807__874.json | 90.0 | missing | missing | missing | |
| 3242 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_193215__216 | 3 | 0.00351979 | 23.1179 | 5 | [192, 371] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_193215__216.json | 90.0 | missing | missing | missing | |
| 3243 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_193236__160 | 3 | 0.0036735 | 20.5952 | 5 | [192, 390] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_193236__160.json | 90.0 | missing | missing | missing | |
| 3244 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_192118__734 | 4 | 0.0 | 6.72049 | 5 | [192, 299] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_192118__734.json | 95.0 | 0.9 | missing | 0.3 |
| 3245 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_215205__246 | 4 | 0.00467698 | 76.3694 | 5 | [288, 482] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_215205__246.json | 95.0 | missing | missing | missing | |
| 3246 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_185717__756 | 4 | 0.00341494 | 8.49533 | 5 | [288, 326] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_185717__756.json | 95.0 | missing | missing | missing | |
| 3247 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_185731__532 | 4 | 0.00565587 | 13.641 | 5 | [288, 603] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_185731__532.json | 95.0 | missing | missing | missing | |
| 3248 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_193138__691 | 4 | 0.00469316 | 11.2645 | 5 | [288, 484] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193138__691.json | 95.0 | missing | missing | missing | |
| 3249 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_193152__167 | 2 | 0.0049844 | 14.0629 | 5 | [288, 520] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193152__167.json | 85.0 | missing | missing | missing | |
| 3250 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_192111__914 | 4 | 0.0 | 61.6537 | 5 | [288, 520] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_192111__914.json | 95.0 | 0.9 | missing | 0.3 |
| 3251 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_214546__920 | 3 | 0.00385775 | 29.4978 | 4 | [455, 325] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_214546__920.json | 85.0 | missing | missing | missing | |
| 3252 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_185951__168 | 0 | 0.00614722 | 14.1367 | 0 | [455, 608] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_185951__168.json | 50.0 | missing | missing | missing | |
| 3253 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_190023__942 | 0 | 0.00468293 | 31.3679 | 0 | [455, 427] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190023__942.json | 50.0 | missing | missing | missing | |
| 3254 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_193430__196 | 0 | 0.00514406 | 24.5764 | 4 | [455, 484] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193430__196.json | 70.0 | missing | missing | missing | |
| 3255 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_193517__850 | 1 | 0.00585598 | 47.0716 | 4 | [455, 572] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193517__850.json | 75.0 | missing | missing | missing | |
| 3256 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_192352__194 | 0 | 0.0 | 76.1042 | 2 | [455, 530] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_192352__194.json | 60.0 | 0.9 | missing | 0.3 |
| 3257 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_214516__639 | 2 | 0.00608249 | 57.1758 | 4 | [452, 601] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_214516__639.json | 80.0 | missing | missing | missing | |
| 3258 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_185923__273 | 3 | 0.00487708 | 21.1375 | 4 | [452, 452] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_185923__273.json | 85.0 | missing | missing | missing | |
| 3259 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_185937__900 | 1 | 0.00481236 | 13.3095 | 4 | [452, 444] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_185937__900.json | 75.0 | missing | missing | missing | |
| 3260 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_193334__375 | 4 | 0.00499034 | 13.6803 | 4 | [452, 466] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_193334__375.json | 90.0 | missing | missing | missing | |
| 3261 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_193405__362 | 3 | 0.00579125 | 30.2613 | 4 | [452, 565] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_193405__362.json | 85.0 | missing | missing | missing | |
| 3262 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_192235__784 | 2 | 0.0 | 60.377 | 4 | [452, 398] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_192235__784.json | 80.0 | 0.9 | missing | 0.3 |
| 3263 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | AsIs | 1SHOT | true | true | 5 | 20231213_214013__598 | 1 | 0.00115371 | 7.36961 | 1 | [155, 543] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__AsIs__1SHOT__20231213_214013__598.json | 60.0 | missing | missing | missing | |
| 3264 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | AsIs | 1SHOT | true | false | 5 | 20231225_185617__265 | 0 | 0.000829725 | 6.16402 | 0 | [155, 376] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__AsIs__1SHOT__20231225_185617__265.json | 25.0 | missing | missing | missing | |
| 3265 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_185619__153 | 0 | 0.000208925 | 1.15765 | 0 | [155, 56] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__AsIs__1SHOT__20231225_185619__153.json | 0.0 | missing | missing | missing | |
| 3266 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small--optim | AsIs | 1SHOT | true | false | 5 | 20231215_191956__642 | 0 | 0.0 | 5.47849 | 0 | [155, 409] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__AsIs__1SHOT__20231215_191956__642.json | 25.0 | 0.9 | missing | 0.3 |
| 3267 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231213_214005__679 | 3 | 0.000967466 | 6.0363 | 5 | [158, 446] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__InJulia__1SHOT__20231213_214005__679.json | 90.0 | missing | missing | missing | |
| 3268 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_185606__734 | 1 | 0.000690046 | 4.28168 | 1 | [158, 303] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__InJulia__1SHOT__20231225_185606__734.json | 60.0 | missing | missing | missing | |
| 3269 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_185611__176 | 0 | 0.000765706 | 4.68684 | 0 | [158, 342] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__InJulia__1SHOT__20231225_185611__176.json | 50.0 | missing | missing | missing | |
| 3270 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_193024__971 | 4 | 0.000713326 | 5.73129 | 5 | [158, 315] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__InJulia__1SHOT__20231227_193024__971.json | 95.0 | missing | missing | missing | |
| 3271 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_193029__409 | 0 | 0.000773466 | 4.68399 | 0 | [158, 346] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__InJulia__1SHOT__20231227_193029__409.json | 50.0 | missing | missing | missing | |
| 3272 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small--optim | InJulia | 1SHOT | true | true | 5 | 20231215_191951__668 | 4 | 0.0 | 4.55772 | 5 | [158, 343] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__InJulia__1SHOT__20231215_191951__668.json | 95.0 | 0.9 | missing | 0.3 |
| 3273 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_213959__557 | 0 | 0.000720453 | 4.18666 | 0 | [199, 305] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_213959__557.json | 50.0 | missing | missing | missing | |
| 3274 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_185558__753 | 0 | 0.000856253 | 5.21077 | 0 | [199, 375] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_185558__753.json | 50.0 | missing | missing | missing | |
| 3275 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_185602__482 | 0 | 0.000576893 | 3.28535 | 0 | [199, 231] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_185602__482.json | 50.0 | missing | missing | missing | |
| 3276 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_193012__976 | 0 | 0.000646733 | 4.03952 | 0 | [199, 267] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_193012__976.json | 50.0 | missing | missing | missing | |
| 3277 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_193018__563 | 0 | 0.000701053 | 6.11434 | 0 | [199, 295] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_193018__563.json | 50.0 | missing | missing | missing | |
| 3278 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_191946__165 | 0 | 0.0 | 3.91959 | 0 | [199, 295] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_191946__165.json | 50.0 | 0.9 | missing | 0.3 |
| 3279 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_213955__812 | 0 | 0.000780625 | 6.07615 | 0 | [295, 304] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_213955__812.json | 0.0 | missing | missing | missing | |
| 3280 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_185547__802 | 4 | 0.0010076 | 5.74302 | 5 | [295, 421] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_185547__802.json | 95.0 | missing | missing | missing | |
| 3281 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_185553__814 | 0 | 0.00103089 | 6.03781 | 0 | [295, 433] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_185553__814.json | 0.0 | missing | missing | missing | |
| 3282 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_193004__727 | 0 | 0.000355765 | 1.52817 | 0 | [295, 85] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193004__727.json | 0.0 | missing | missing | missing | |
| 3283 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_193008__450 | 0 | 0.000778685 | 4.18606 | 0 | [295, 303] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193008__450.json | 0.0 | missing | missing | missing | |
| 3284 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_191942__415 | 4 | 0.0 | 6.01642 | 5 | [295, 453] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_191942__415.json | 95.0 | 0.9 | missing | 0.3 |
| 3285 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_214033__400 | 0 | 0.00143187 | 8.12367 | 1 | [465, 583] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_214033__400.json | 55.0 | missing | missing | missing | |
| 3286 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_185656__121 | 1 | 0.00132518 | 7.22204 | 1 | [465, 528] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_185656__121.json | 60.0 | missing | missing | missing | |
| 3287 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_185708__374 | 1 | 0.0020197 | 12.0336 | 1 | [465, 886] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_185708__374.json | 60.0 | missing | missing | missing | |
| 3288 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_193107__953 | 1 | 0.00244844 | 16.6921 | 4 | [465, 1107] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193107__953.json | 75.0 | missing | missing | missing | |
| 3289 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_193126__291 | 1 | 0.00294702 | 18.9613 | 1 | [465, 1364] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193126__291.json | 60.0 | missing | missing | missing | |
| 3290 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_192010__821 | 0 | 0.0 | 4.13563 | 0 | [465, 304] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_192010__821.json | 0.0 | 0.9 | missing | 0.3 |
| 3291 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_214025__304 | 2 | 0.00196214 | 11.9099 | 5 | [463, 857] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_214025__304.json | 85.0 | missing | missing | missing | |
| 3292 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_185628__518 | 1 | 0.00158966 | 9.14786 | 1 | [463, 665] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_185628__518.json | 60.0 | missing | missing | missing | |
| 3293 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_185648__664 | 4 | 0.00292438 | 20.4006 | 5 | [463, 1353] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_185648__664.json | 95.0 | missing | missing | missing | |
| 3294 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_193038__512 | 1 | 0.00151788 | 8.65532 | 1 | [463, 628] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_193038__512.json | 60.0 | missing | missing | missing | |
| 3295 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_193050__149 | 3 | 0.0020475 | 12.3505 | 4 | [463, 901] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_193050__149.json | 85.0 | missing | missing | missing | |
| 3296 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-small--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_192005__486 | 1 | 0.0 | 8.65626 | 1 | [463, 628] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_192005__486.json | 60.0 | 0.9 | missing | 0.3 |
| 3297 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_213932__148 | 0 | 0.000246841 | 7.25553 | 0 | [155, 497] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__AsIs__1SHOT__20231213_213932__148.json | 0.0 | missing | missing | missing | |
| 3298 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_185515__663 | 0 | 0.00025273 | 4.45309 | 0 | [155, 510] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__AsIs__1SHOT__20231225_185515__663.json | 0.0 | missing | missing | missing | |
| 3299 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_185519__539 | 0 | 0.00025273 | 4.44133 | 0 | [155, 510] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__AsIs__1SHOT__20231225_185519__539.json | 0.0 | missing | missing | missing | |
| 3300 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny--optim | AsIs | 1SHOT | false | false | 5 | 20231215_191928__186 | 0 | 0.0 | 2.93687 | 0 | [155, 350] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__AsIs__1SHOT__20231215_191928__186.json | 0.0 | 0.9 | missing | 0.3 |
| 3301 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231213_213925__189 | 1 | 0.000174328 | 5.07528 | 1 | [158, 336] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__InJulia__1SHOT__20231213_213925__189.json | 60.0 | missing | missing | missing | |
| 3302 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_185507__129 | 1 | 0.000210115 | 3.58526 | 1 | [158, 415] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__InJulia__1SHOT__20231225_185507__129.json | 60.0 | missing | missing | missing | |
| 3303 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_185510__809 | 5 | 0.000177952 | 3.11554 | 5 | [158, 344] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__InJulia__1SHOT__20231225_185510__809.json | 100.0 | missing | missing | missing | |
| 3304 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_192935__146 | 2 | 0.000182482 | 3.07909 | 3 | [158, 354] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__InJulia__1SHOT__20231227_192935__146.json | 75.0 | missing | missing | missing | |
| 3305 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_192939__267 | 1 | 0.000237748 | 4.21698 | 1 | [158, 476] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__InJulia__1SHOT__20231227_192939__267.json | 60.0 | missing | missing | missing | |
| 3306 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny--optim | InJulia | 1SHOT | false | false | 5 | 20231215_191925__704 | 0 | 0.0 | 3.19072 | 0 | [158, 375] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__InJulia__1SHOT__20231215_191925__704.json | 0.0 | 0.9 | missing | 0.3 | |
| 3307 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_213920__503 | 0 | 0.00010487 | 3.20386 | 0 | [199, 170] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_213920__503.json | 50.0 | missing | missing | missing | |
| 3308 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_185501__253 | 0 | 0.000111665 | 1.761 | 0 | [199, 185] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_185501__253.json | 50.0 | missing | missing | missing | |
| 3309 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_185503__138 | 1 | 9.5357e-5 | 1.85674 | 1 | [199, 149] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_185503__138.json | 60.0 | missing | missing | missing | |
| 3310 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_192929__811 | 0 | 0.000110759 | 1.80355 | 0 | [199, 183] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_192929__811.json | 0.0 | missing | missing | missing | |
| 3311 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_192932__566 | 0 | 0.000157871 | 2.5492 | 0 | [199, 287] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_192932__566.json | 50.0 | missing | missing | missing | |
| 3312 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_191921__312 | 0 | 0.0 | 1.53268 | 0 | [199, 151] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_191921__312.json | 50.0 | 0.9 | missing | 0.3 | |
| 3313 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_213916__856 | 0 | 0.000191696 | 4.78932 | 0 | [295, 332] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_213916__856.json | 0.0 | missing | missing | missing | |
| 3314 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_185455__675 | 4 | 0.000292262 | 9.42163 | 5 | [295, 554] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_185455__675.json | 95.0 | missing | missing | missing | |
| 3315 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_185459__727 | 0 | 0.000251039 | 4.1574 | 0 | [295, 463] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_185459__727.json | 0.0 | missing | missing | missing | |
| 3316 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_192923__687 | 0 | 0.000195773 | 8.94399 | 0 | [295, 341] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_192923__687.json | 50.0 | missing | missing | missing | |
| 3317 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_192928__182 | 0 | 0.000242885 | 4.14388 | 0 | [295, 445] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_192928__182.json | 50.0 | missing | missing | missing | |
| 3318 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_191920__682 | 0 | 0.0 | 4.9344 | 0 | [295, 364] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_191920__682.json | 50.0 | 0.9 | missing | 0.3 | |
| 3319 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_213949__104 | 1 | 0.000308814 | 6.54524 | 1 | [465, 538] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_213949__104.json | 60.0 | missing | missing | missing | |
| 3320 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_185537__263 | 1 | 0.000212325 | 2.95704 | 1 | [465, 325] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_185537__263.json | 60.0 | missing | missing | missing | |
| 3321 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_185541__306 | 1 | 0.00022818 | 3.24631 | 1 | [465, 360] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_185541__306.json | 60.0 | missing | missing | missing | |
| 3322 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_192957__844 | 4 | 0.000271668 | 4.30338 | 5 | [465, 456] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_192957__844.json | 95.0 | missing | missing | missing | |
| 3323 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_193002__138 | 1 | 0.000284805 | 4.62743 | 1 | [465, 485] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193002__138.json | 60.0 | missing | missing | missing | |
| 3324 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_191936__522 | 4 | 0.0 | 3.73484 | 5 | [465, 436] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_191936__522.json | 95.0 | 0.9 | missing | 0.3 | |
| 3325 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_213942__946 | 4 | 0.000313517 | 9.48031 | 5 | [463, 549] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_213942__946.json | 95.0 | missing | missing | missing | |
| 3326 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_185527__572 | 4 | 0.000237866 | 7.97895 | 5 | [463, 382] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_185527__572.json | 95.0 | missing | missing | missing | |
| 3327 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_185534__463 | 0 | 0.00040457 | 6.65102 | 5 | [463, 750] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_185534__463.json | 75.0 | missing | missing | missing | |
| 3328 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_192948__507 | 0 | 0.000207062 | 8.44612 | 0 | [463, 314] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_192948__507.json | 0.0 | missing | missing | missing | |
| 3329 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_192953__854 | 1 | 0.000264593 | 4.28545 | 1 | [463, 441] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_192953__854.json | 60.0 | missing | missing | missing | |
| 3330 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_191932__381 | 0 | 0.0 | 4.21318 | 5 | [463, 488] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_191932__381.json | 75.0 | 0.9 | missing | 0.3 | |
| 3331 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | true | false | 5 | 20231219_222547__214 | 0 | 0.0 | 25.5926 | 0 | [138, 706] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_222547__214.json | 25.0 | missing | missing | missing | |
| 3332 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_222611__900 | 0 | 0.0 | 24.4944 | 0 | [1, 704] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_222611__900.json | 0.0 | missing | missing | missing | |
| 3333 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_222634__743 | 0 | 0.0 | 23.2203 | 0 | [1, 671] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_222634__743.json | 0.0 | missing | missing | missing | |
| 3334 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_015151__729 | 0 | 0.0 | 14.7211 | 0 | [154, 358] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_015151__729.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3335 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_015207__486 | 0 | 0.0 | 15.7424 | 0 | [154, 384] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_015207__486.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3336 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_222501__871 | 0 | 0.0 | 26.7759 | 0 | [1, 762] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_222501__871.json | 50.0 | missing | missing | missing | |
| 3337 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_222521__419 | 0 | 0.0 | 20.1413 | 0 | [1, 589] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_222521__419.json | 50.0 | missing | missing | missing | |
| 3338 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_015119__356 | 0 | 0.0 | 14.397 | 0 | [157, 350] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_015119__356.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3339 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_015136__517 | 0 | 0.0 | 17.3861 | 0 | [157, 425] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_015136__517.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3340 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_225055__303 | 3 | 0.0 | 16.6469 | 5 | [157, 406] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_225055__303.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3341 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_222400__187 | 0 | 0.0 | 12.9663 | 0 | [1, 387] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_222400__187.json | 50.0 | missing | missing | missing | |
| 3342 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_222416__116 | 0 | 0.0 | 15.4509 | 0 | [1, 456] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_222416__116.json | 50.0 | missing | missing | missing | |
| 3343 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_015056__685 | 1 | 0.0 | 16.2827 | 1 | [198, 388] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_015056__685.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3344 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_015104__152 | 0 | 0.0 | 7.98344 | 1 | [198, 177] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_015104__152.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3345 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_225039__625 | 0 | 0.0 | 7.07708 | 0 | [198, 154] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_225039__625.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3346 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_222315__711 | 1 | 0.0 | 23.9404 | 5 | [1, 662] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_222315__711.json | 80.0 | missing | missing | missing | |
| 3347 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_222334__354 | 0 | 0.0 | 18.5604 | 1 | [1, 525] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_222334__354.json | 55.0 | missing | missing | missing | |
| 3348 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_015037__424 | 0 | 0.0 | 22.4648 | 5 | [294, 381] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015037__424.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3349 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_015040__838 | 0 | 0.0 | 2.92654 | 0 | [294, 33] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015040__838.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3350 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_225031__143 | 2 | 0.0 | 28.3601 | 5 | [294, 536] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225031__143.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3351 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_222740__427 | 0 | 0.0 | 21.8417 | 0 | [1, 581] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_222740__427.json | 0.0 | missing | missing | missing | |
| 3352 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_222801__837 | 0 | 0.0 | 20.3479 | 0 | [1, 544] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_222801__837.json | 0.0 | missing | missing | missing | |
| 3353 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015307__199 | 0 | 0.0 | 21.105 | 0 | [465, 460] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015307__199.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3354 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015329__392 | 0 | 0.0 | 21.8849 | 0 | [465, 479] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015329__392.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3355 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_225136__679 | 0 | 0.0 | 18.8599 | 0 | [465, 406] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225136__679.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3356 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_222712__892 | 0 | 0.0 | 10.7321 | 0 | [1, 298] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_222712__892.json | 0.0 | missing | missing | missing | |
| 3357 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_222714__343 | 0 | 0.0 | 2.15599 | 0 | [1, 62] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_222714__343.json | 0.0 | missing | missing | missing | |
| 3358 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_015224__953 | 0 | 0.0 | 17.205 | 0 | [463, 366] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_015224__953.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3359 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_015246__442 | 0 | 0.0 | 21.7891 | 5 | [463, 477] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_015246__442.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3360 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_225117__780 | 0 | 0.0 | 22.1063 | 5 | [463, 484] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_225117__780.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3361 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225911__715 | 1 | 0.0 | 15.9333 | 1 | [156, 490] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225911__715.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3362 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225922__293 | 0 | 0.0 | 10.3067 | 0 | [156, 312] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225922__293.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3363 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225933__478 | 4 | 0.0 | 11.7945 | 4 | [156, 360] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225933__478.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3364 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225945__894 | 1 | 0.0 | 11.1702 | 1 | [156, 340] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225945__894.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3365 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225958__799 | 1 | 0.0 | 12.4873 | 1 | [156, 382] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225958__799.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3366 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225830__509 | 0 | 0.0 | 5.70508 | 0 | [197, 150] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225830__509.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3367 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225837__540 | 1 | 0.0 | 6.69868 | 1 | [197, 186] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225837__540.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3368 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225842__232 | 1 | 0.0 | 5.73666 | 5 | [197, 155] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225842__232.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 3369 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_225849__573 | 0 | 0.0 | 6.53278 | 0 | [197, 181] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225849__573.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3370 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225855__594 | 1 | 0.0 | 6.15435 | 1 | [197, 169] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225855__594.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3371 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225714__831 | 1 | 0.0 | 16.0615 | 1 | [293, 439] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225714__831.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3372 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225732__527 | 1 | 0.0 | 18.3042 | 1 | [293, 531] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225732__527.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3373 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_225748__666 | 0 | 0.0 | 15.7005 | 0 | [293, 452] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225748__666.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 3374 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225807__160 | 2 | 0.0 | 19.1978 | 4 | [293, 559] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225807__160.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 3375 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225823__367 | 2 | 0.0 | 14.8073 | 5 | [293, 425] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225823__367.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 3376 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230132__800 | 3 | 0.0 | 13.1829 | 4 | [464, 344] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230132__800.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 3377 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230148__510 | 1 | 0.0 | 15.2176 | 1 | [464, 405] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230148__510.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3378 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230159__638 | 1 | 0.0 | 11.3118 | 1 | [464, 286] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230159__638.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3379 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_230216__981 | 0 | 0.0 | 16.3178 | 0 | [464, 438] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230216__981.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 3380 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230231__704 | 1 | 0.0 | 15.0326 | 1 | [464, 399] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230231__704.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3381 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230020__408 | 1 | 0.0 | 22.1162 | 1 | [462, 608] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230020__408.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3382 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230033__631 | 1 | 0.0 | 13.1995 | 1 | [462, 344] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230033__631.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3383 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230046__365 | 1 | 0.0 | 12.3823 | 1 | [462, 319] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230046__365.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3384 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_230102__391 | 0 | 0.0 | 15.8378 | 0 | [462, 423] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230102__391.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 3385 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_230119__920 | 0 | 0.0 | 17.2961 | 0 | [462, 467] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230119__920.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3386 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230518__139 | 0 | 0.0 | 21.1633 | 0 | [156, 517] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230518__139.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3387 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230537__924 | 0 | 0.0 | 18.8878 | 0 | [156, 461] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230537__924.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3388 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230551__192 | 0 | 0.0 | 13.7787 | 0 | [156, 333] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230551__192.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3389 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230606__197 | 0 | 0.0 | 15.2959 | 0 | [156, 371] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230606__197.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3390 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230621__315 | 0 | 0.0 | 14.5072 | 0 | [156, 351] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230621__315.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3391 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230421__676 | 2 | 0.0 | 8.71803 | 4 | [197, 195] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230421__676.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 3392 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230430__104 | 1 | 0.0 | 8.81139 | 1 | [197, 197] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230430__104.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3393 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230437__216 | 4 | 0.0 | 7.28573 | 5 | [197, 158] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230437__216.json | 95.0 | missing | {"num_gpu": 99} | missing | |
| 3394 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230446__228 | 0 | 0.0 | 8.34264 | 0 | [197, 185] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230446__228.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3395 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230457__688 | 1 | 0.0 | 10.7599 | 1 | [197, 247] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230457__688.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3396 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230258__150 | 0 | 0.0 | 26.8734 | 5 | [293, 608] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230258__150.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 3397 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230311__192 | 2 | 0.0 | 13.5903 | 5 | [293, 303] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230311__192.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 3398 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230325__758 | 0 | 0.0 | 13.4033 | 0 | [293, 298] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230325__758.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3399 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230345__699 | 0 | 0.0 | 20.2852 | 0 | [293, 468] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230345__699.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3400 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230412__400 | 0 | 0.0 | 26.8961 | 0 | [293, 628] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230412__400.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3401 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_230835__730 | 0 | 0.0 | 22.9244 | 0 | [464, 501] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230835__730.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3402 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230856__890 | 3 | 0.0 | 19.9997 | 4 | [464, 432] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230856__890.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 3403 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230914__401 | 2 | 0.0 | 18.1412 | 2 | [464, 387] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230914__401.json | 70.0 | missing | {"num_gpu": 99} | missing | |
| 3404 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230940__871 | 0 | 0.0 | 26.4502 | 0 | [464, 585] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230940__871.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3405 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_231009__823 | 0 | 0.0 | 28.495 | 0 | [464, 633] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231009__823.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3406 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230638__871 | 1 | 0.0 | 17.1058 | 1 | [462, 362] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230638__871.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3407 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230655__127 | 1 | 0.0 | 16.9649 | 1 | [462, 358] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230655__127.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3408 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230724__740 | 5 | 0.0 | 28.5364 | 5 | [462, 634] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230724__740.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 3409 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230747__174 | 5 | 0.0 | 22.4799 | 5 | [462, 491] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230747__174.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 3410 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230812__219 | 4 | 0.0 | 24.7566 | 5 | [462, 545] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230812__219.json | 95.0 | missing | {"num_gpu": 99} | missing | |
| 3411 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_120725__675 | 0 | 0.0 | 24.0426 | 0 | [153, 430] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_120725__675.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3412 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_120744__816 | 0 | 0.0 | 19.0995 | 0 | [153, 340] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_120744__816.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3413 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_120640__917 | 1 | 0.0 | 18.2328 | 1 | [156, 322] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_120640__917.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3414 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_120701__309 | 0 | 0.0 | 20.5566 | 0 | [156, 366] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_120701__309.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3415 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_225530__424 | 1 | 0.0 | 19.8629 | 1 | [156, 356] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_225530__424.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3416 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_120612__402 | 0 | 0.0 | 8.81605 | 0 | [197, 142] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120612__402.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3417 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_120622__860 | 0 | 0.0 | 9.82826 | 0 | [197, 159] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120622__860.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3418 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_225509__756 | 1 | 0.0 | 11.4169 | 1 | [197, 192] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_225509__756.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3419 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_120535__349 | 0 | 0.0 | 32.0354 | 0 | [293, 553] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120535__349.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3420 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_120603__955 | 0 | 0.0 | 27.7579 | 0 | [293, 476] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120603__955.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 3421 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_225458__605 | 0 | 0.0 | 34.0564 | 0 | [293, 436] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225458__605.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3422 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_120903__197 | 2 | 0.0 | 21.4305 | 5 | [464, 338] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120903__197.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 3423 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_120931__938 | 1 | 0.0 | 28.3009 | 1 | [464, 455] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120931__938.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3424 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_225633__644 | 1 | 0.0 | 23.733 | 1 | [464, 385] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225633__644.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3425 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_120809__433 | 2 | 0.0 | 24.9299 | 4 | [462, 396] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120809__433.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 3426 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_120842__828 | 0 | 0.0 | 32.2973 | 0 | [462, 522] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120842__828.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 3427 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_225609__676 | 1 | 0.0 | 38.9794 | 1 | [462, 655] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_225609__676.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3428 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_105356__462 | 3 | 0.0 | 61.9122 | 5 | [159, 340] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_105356__462.json | 90.0 | missing | {"num_gpu": 99} | missing | |
| 3429 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_105529__388 | 1 | 0.0 | 93.0929 | 1 | [159, 507] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_105529__388.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 3430 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_105614__471 | 2 | 0.0 | 45.0402 | 5 | [159, 250] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_105614__471.json | 85.0 | missing | {"num_gpu": 99} | missing | |
| 3431 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_143856__832 | 0 | 0.0 | 56.1772 | 0 | [159, 318] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_143856__832.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3432 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_144018__126 | 3 | 0.0 | 81.2281 | 4 | [159, 467] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_144018__126.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3433 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_105135__108 | 1 | 0.0 | 59.1567 | 2 | [198, 327] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_105135__108.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3434 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_105223__823 | 0 | 0.0 | 47.5428 | 0 | [198, 257] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_105223__823.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3435 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_105254__284 | 0 | 0.0 | 30.5051 | 0 | [198, 154] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_105254__284.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3436 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_143710__840 | 0 | 0.0 | 39.9977 | 0 | [198, 208] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_143710__840.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3437 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_143800__642 | 0 | 0.0 | 50.4785 | 0 | [198, 274] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_143800__642.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3438 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_104819__420 | 0 | 0.0 | 113.663 | 0 | [290, 598] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104819__420.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3439 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_104945__837 | 1 | 0.0 | 86.1574 | 1 | [290, 469] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104945__837.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3440 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_105036__260 | 0 | 0.0 | 50.6996 | 0 | [290, 260] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_105036__260.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3441 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_143459__836 | 1 | 0.0 | 73.7959 | 1 | [290, 395] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_143459__836.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3442 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_143628__841 | 2 | 0.0 | 87.8332 | 5 | [290, 477] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_143628__841.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3443 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_110048__247 | 0 | 0.0 | 12.6012 | 0 | [472, 4] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110048__247.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3444 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_110146__675 | 0 | 0.0 | 58.8394 | 0 | [472, 278] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110146__675.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3445 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_110159__431 | 0 | 0.0 | 12.7126 | 0 | [472, 5] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110159__431.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3446 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_144503__859 | 0 | 0.0 | 122.708 | 0 | [472, 641] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_144503__859.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3447 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_144516__720 | 0 | 0.0 | 12.7586 | 0 | [472, 4] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_144516__720.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3448 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_105756__507 | 0 | 0.0 | 100.452 | 0 | [470, 516] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_105756__507.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3449 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_105946__369 | 4 | 0.0 | 110.003 | 5 | [470, 571] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_105946__369.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3450 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_110035__188 | 1 | 0.0 | 48.3991 | 5 | [470, 216] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_110035__188.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3451 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_144048__872 | 0 | 0.0 | 30.1751 | 0 | [470, 108] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_144048__872.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3452 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_144300__964 | 3 | 0.0 | 131.459 | 5 | [470, 690] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_144300__964.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3453 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_223136__447 | 0 | 0.0 | 21.8519 | 0 | [138, 608] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_223136__447.json | 0.0 | missing | missing | missing | |
| 3454 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_223159__635 | 0 | 0.0 | 22.6528 | 0 | [1, 656] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_223159__635.json | 0.0 | missing | missing | missing | |
| 3455 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_223218__413 | 0 | 0.0 | 18.9603 | 0 | [1, 558] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_223218__413.json | 0.0 | missing | missing | missing | |
| 3456 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_015518__795 | 0 | 0.0 | 15.3485 | 0 | [162, 369] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_015518__795.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3457 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_015535__556 | 0 | 0.0 | 17.5846 | 0 | [162, 426] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_015535__556.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3458 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_223056__316 | 0 | 0.0 | 19.3416 | 0 | [1, 568] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_223056__316.json | 0.0 | missing | missing | missing | |
| 3459 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_223115__857 | 0 | 0.0 | 18.6061 | 4 | [1, 548] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_223115__857.json | 70.0 | missing | missing | missing | |
| 3460 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_015450__735 | 0 | 0.0 | 12.9547 | 4 | [165, 309] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_015450__735.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3461 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_015502__141 | 2 | 0.0 | 12.2239 | 3 | [165, 290] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_015502__141.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3462 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_225220__814 | 0 | 0.0 | 15.5551 | 1 | [165, 374] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_225220__814.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3463 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_223004__444 | 0 | 0.0 | 20.6822 | 0 | [1, 597] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_223004__444.json | 0.0 | missing | missing | missing | |
| 3464 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_223017__781 | 0 | 0.0 | 13.1026 | 0 | [1, 391] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_223017__781.json | 50.0 | missing | missing | missing | |
| 3465 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_015420__723 | 0 | 0.0 | 9.12353 | 4 | [206, 206] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_015420__723.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3466 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_015437__182 | 0 | 0.0 | 16.203 | 0 | [206, 385] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_015437__182.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3467 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_225205__403 | 0 | 0.0 | 10.0376 | 4 | [206, 229] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_225205__403.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3468 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_222906__163 | 0 | 0.0 | 24.1287 | 0 | [1, 667] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_222906__163.json | 0.0 | missing | missing | missing | |
| 3469 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_222928__143 | 1 | 0.0 | 22.021 | 1 | [1, 614] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_222928__143.json | 60.0 | missing | missing | missing | |
| 3470 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_015351__830 | 0 | 0.0 | 21.9191 | 1 | [302, 353] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015351__830.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3471 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_015411__150 | 0 | 0.0 | 19.1149 | 1 | [302, 440] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015411__150.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3472 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_225154__419 | 1 | 0.0 | 17.8548 | 1 | [302, 261] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225154__419.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3473 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_223448__912 | 0 | 0.0 | 8.52838 | 0 | [1, 239] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_223448__912.json | 0.0 | missing | missing | missing | |
| 3474 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_223528__140 | 0 | 0.0 | 40.0495 | 0 | [1, 1002] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_223528__140.json | 0.0 | missing | missing | missing | |
| 3475 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015618__110 | 0 | 0.0 | 11.5996 | 0 | [473, 228] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015618__110.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3476 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015639__114 | 4 | 0.0 | 21.0126 | 5 | [473, 457] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015639__114.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3477 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_225301__735 | 0 | 0.0 | 23.8841 | 0 | [473, 525] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225301__735.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3478 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_223336__637 | 0 | 0.0 | 22.2286 | 0 | [1, 591] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_223336__637.json | 0.0 | missing | missing | missing | |
| 3479 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_223437__875 | 0 | 0.0 | 60.9232 | 1 | [1, 1436] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_223437__875.json | 55.0 | missing | missing | missing | |
| 3480 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_015551__447 | 2 | 0.0 | 15.7896 | 5 | [471, 331] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_015551__447.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3481 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_015607__441 | 4 | 0.0 | 14.6284 | 5 | [471, 302] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_015607__441.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3482 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_225237__866 | 2 | 0.0 | 15.655 | 5 | [471, 327] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_225237__866.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3483 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231213_235409__765 | 0 | 0.0 | 23.4222 | 0 | [138, 649] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231213_235409__765.json | 0.0 | missing | missing | missing | |
| 3484 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_004741__101 | 0 | 0.0 | 12.8666 | 0 | [160, 400] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_004741__101.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3485 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_004758__560 | 0 | 0.0 | 16.8244 | 0 | [160, 525] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_004758__560.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3486 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231213_235345__285 | 0 | 0.0 | 16.754 | 4 | [155, 466] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231213_235345__285.json | 70.0 | missing | missing | missing | |
| 3487 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_004718__660 | 4 | 0.0 | 13.35 | 5 | [163, 411] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_004718__660.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3488 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_004728__700 | 1 | 0.0 | 9.08286 | 1 | [163, 269] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_004728__700.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3489 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_222238__996 | 5 | 0.0 | 13.9987 | 5 | [163, 433] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_222238__996.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3490 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_235328__807 | 0 | 0.0 | 16.798 | 3 | [184, 455] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231213_235328__807.json | 65.0 | missing | missing | missing | |
| 3491 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_004654__131 | 4 | 0.0 | 11.3195 | 5 | [204, 339] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_004654__131.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3492 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_004704__359 | 2 | 0.0 | 10.4713 | 5 | [204, 312] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_004704__359.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3493 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_222224__606 | 3 | 0.0 | 12.8794 | 5 | [204, 390] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_222224__606.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3494 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_235311__722 | 0 | 0.0 | 18.7638 | 0 | [280, 469] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231213_235311__722.json | 0.0 | missing | missing | missing | |
| 3495 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004624__360 | 3 | 0.0 | 24.4981 | 5 | [300, 563] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004624__360.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3496 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004642__263 | 0 | 0.0 | 15.8772 | 4 | [300, 459] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004642__263.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3497 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_222210__357 | 4 | 0.0 | 22.5858 | 5 | [300, 518] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_222210__357.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3498 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_235451__682 | 0 | 0.0 | 17.9051 | 0 | [11, 476] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231213_235451__682.json | 50.0 | missing | missing | missing | |
| 3499 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_004854__827 | 1 | 0.0 | 18.6227 | 2 | [471, 513] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004854__827.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3500 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_004907__397 | 0 | 0.0 | 13.1066 | 0 | [471, 344] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004907__397.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3501 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_222311__482 | 3 | 0.0 | 18.4136 | 5 | [471, 508] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_222311__482.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3502 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_235433__829 | 0 | 0.0 | 24.1298 | 0 | [455, 525] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231213_235433__829.json | 25.0 | missing | missing | missing | |
| 3503 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_004815__305 | 0 | 0.0 | 16.797 | 1 | [469, 458] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_004815__305.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3504 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_004834__688 | 3 | 0.0 | 19.6766 | 5 | [469, 545] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_004834__688.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3505 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_222252__805 | 4 | 0.0 | 12.9699 | 5 | [469, 339] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_222252__805.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3506 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231214_001047__106 | 0 | 0.0 | 20.9479 | 0 | [138, 582] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__AsIs__1SHOT__20231214_001047__106.json | 0.0 | missing | missing | missing | |
| 3507 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_011813__156 | 0 | 0.0 | 4.04314 | 0 | [155, 52] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__AsIs__1SHOT__20231225_011813__156.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3508 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_011816__946 | 0 | 0.0 | 3.39199 | 0 | [155, 40] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__AsIs__1SHOT__20231225_011816__946.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3509 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231214_001026__197 | 0 | 0.0 | 22.0328 | 0 | [155, 609] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__InJulia__1SHOT__20231214_001026__197.json | 0.0 | missing | missing | missing | |
| 3510 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_011805__320 | 0 | 0.0 | 4.34159 | 0 | [158, 58] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__InJulia__1SHOT__20231225_011805__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3511 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_011808__694 | 0 | 0.0 | 3.93989 | 0 | [158, 50] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__InJulia__1SHOT__20231225_011808__694.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3512 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231226_223358__776 | 0 | 0.0 | 2.81436 | 0 | [158, 29] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__InJulia__1SHOT__20231226_223358__776.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3513 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_001004__106 | 0 | 0.0 | 19.445 | 0 | [184, 527] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_001004__106.json | 50.0 | missing | missing | missing | |
| 3514 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_011744__360 | 0 | 0.0 | 20.5312 | 0 | [197, 348] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_011744__360.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3515 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_011800__700 | 0 | 0.0 | 16.5377 | 0 | [197, 275] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_011800__700.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3516 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_223355__621 | 0 | 0.0 | 19.9785 | 0 | [197, 339] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_223355__621.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3517 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_000944__714 | 0 | 0.0 | 24.4597 | 0 | [280, 617] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_000944__714.json | 0.0 | missing | missing | missing | |
| 3518 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_011657__568 | 0 | 0.0 | 39.3985 | 0 | [293, 492] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011657__568.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3519 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_011723__741 | 1 | 0.0 | 26.1396 | 1 | [293, 428] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011723__741.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3520 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_223335__733 | 0 | 0.0 | 16.8404 | 0 | [293, 96] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_223335__733.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3521 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_001129__334 | 0 | 0.0 | 20.6436 | 1 | [11, 545] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_001129__334.json | 55.0 | missing | missing | missing | |
| 3522 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_012022__436 | 0 | 0.0 | 32.792 | 1 | [461, 507] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_012022__436.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3523 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_012111__658 | 1 | 0.0 | 48.8575 | 1 | [461, 771] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_012111__658.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3524 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_223527__444 | 0 | 0.0 | 62.3047 | 0 | [461, 988] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_223527__444.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3525 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_001108__866 | 1 | 0.0 | 20.5363 | 1 | [455, 435] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_001108__866.json | 60.0 | missing | missing | missing | |
| 3526 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_011907__697 | 0 | 0.0 | 51.067 | 0 | [458, 808] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_011907__697.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3527 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_011949__181 | 0 | 0.0 | 41.9574 | 0 | [458, 660] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_011949__181.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3528 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_223425__680 | 0 | 0.0 | 27.0679 | 0 | [458, 412] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_223425__680.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3529 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_223840__637 | 0 | 0.0 | 15.7777 | 0 | [138, 443] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_223840__637.json | 0.0 | missing | missing | missing | |
| 3530 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_223907__228 | 0 | 0.0 | 26.2571 | 0 | [1, 749] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_223907__228.json | 0.0 | missing | missing | missing | |
| 3531 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_223934__833 | 0 | 0.0 | 27.3245 | 0 | [1, 776] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_223934__833.json | 0.0 | missing | missing | missing | |
| 3532 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_015740__142 | 0 | 0.0 | 4.68779 | 0 | [147, 169] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_015740__142.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3533 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_015806__246 | 0 | 0.0 | 26.4038 | 0 | [147, 952] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_015806__246.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3534 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231219_223809__387 | 0 | 0.0 | 22.5997 | 0 | [1, 654] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_223809__387.json | 50.0 | missing | missing | missing | |
| 3535 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231219_223824__221 | 0 | 0.0 | 15.101 | 1 | [1, 452] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_223824__221.json | 55.0 | missing | missing | missing | |
| 3536 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_015713__952 | 0 | 0.0 | 10.9863 | 0 | [150, 410] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_015713__952.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3537 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_015735__753 | 0 | 0.0 | 21.713 | 0 | [150, 793] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_015735__753.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3538 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_225402__343 | 0 | 0.0 | 21.0231 | 0 | [150, 767] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_225402__343.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3539 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_223711__411 | 0 | 0.0 | 17.5883 | 0 | [1, 514] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_223711__411.json | 50.0 | missing | missing | missing | |
| 3540 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_223724__622 | 0 | 0.0 | 12.6206 | 0 | [1, 377] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_223724__622.json | 50.0 | missing | missing | missing | |
| 3541 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_015656__129 | 0 | 0.0 | 5.43524 | 0 | [187, 193] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_015656__129.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3542 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_015702__895 | 0 | 0.0 | 6.33638 | 0 | [187, 228] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_015702__895.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3543 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_225341__647 | 0 | 0.0 | 28.3147 | 0 | [187, 1001] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_225341__647.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3544 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_223616__499 | 0 | 0.0 | 23.0583 | 0 | [1, 640] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_223616__499.json | 0.0 | missing | missing | missing | |
| 3545 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_223638__405 | 0 | 0.0 | 21.2794 | 4 | [1, 595] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_223638__405.json | 70.0 | missing | missing | missing | |
| 3546 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_015644__467 | 0 | 0.0 | 4.59122 | 0 | [271, 1] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015644__467.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3547 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_015650__598 | 0 | 0.0 | 6.32847 | 0 | [271, 211] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015650__598.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3548 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_225312__592 | 0 | 0.0 | 11.8317 | 0 | [271, 287] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225312__592.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3549 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_224130__138 | 0 | 0.0 | 21.2263 | 0 | [1, 566] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_224130__138.json | 50.0 | missing | missing | missing | |
| 3550 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_224150__477 | 0 | 0.0 | 19.8011 | 0 | [1, 531] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_224150__477.json | 50.0 | missing | missing | missing | |
| 3551 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_015846__818 | 0 | 0.0 | 1.50748 | 0 | [439, 1] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015846__818.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3552 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_015852__132 | 0 | 0.0 | 6.08236 | 0 | [439, 172] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015852__132.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3553 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_225424__224 | 0 | 0.0 | 11.8003 | 0 | [439, 377] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225424__224.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3554 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_224028__828 | 0 | 0.0 | 30.0108 | 0 | [1, 776] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_224028__828.json | 0.0 | missing | missing | missing | |
| 3555 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_224051__875 | 1 | 0.0 | 22.793 | 1 | [1, 605] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_224051__875.json | 60.0 | missing | missing | missing | |
| 3556 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_015817__408 | 0 | 0.0 | 11.4897 | 0 | [436, 367] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_015817__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3557 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_015844__875 | 0 | 0.0 | 26.6979 | 0 | [436, 877] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_015844__875.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3558 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_225412__773 | 0 | 0.0 | 10.3172 | 0 | [436, 325] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_225412__773.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3559 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231214_001302__183 | 0 | 0.0 | 30.6008 | 0 | [138, 831] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_001302__183.json | 0.0 | missing | missing | missing | |
| 3560 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | AsIs | 1SHOT | true | true | 5 | 20231225_012741__768 | 3 | 0.0 | 55.9215 | 4 | [163, 417] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_012741__768.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3561 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_012827__181 | 0 | 0.0 | 45.464 | 0 | [163, 335] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_012827__181.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3562 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231214_001231__249 | 0 | 0.0 | 20.1494 | 0 | [155, 560] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_001231__249.json | 50.0 | missing | missing | missing | |
| 3563 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_012606__238 | 4 | 0.0 | 49.7197 | 5 | [166, 368] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_012606__238.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3564 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_012645__152 | 4 | 0.0 | 38.5566 | 5 | [166, 280] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_012645__152.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3565 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231226_223858__545 | 3 | 0.0 | 54.1394 | 4 | [166, 399] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_223858__545.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3566 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_001211__932 | 0 | 0.0 | 18.3204 | 0 | [184, 497] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_001211__932.json | 0.0 | missing | missing | missing | |
| 3567 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_012433__789 | 4 | 0.0 | 60.8031 | 5 | [205, 448] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_012433__789.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3568 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_012516__362 | 3 | 0.0 | 43.1207 | 4 | [205, 310] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_012516__362.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3569 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_223804__214 | 4 | 0.0 | 72.3195 | 5 | [205, 536] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_223804__214.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3570 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_001153__399 | 0 | 0.0 | 23.8805 | 0 | [280, 601] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_001153__399.json | 25.0 | missing | missing | missing | |
| 3571 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_012234__304 | 0 | 0.0 | 83.1678 | 0 | [301, 425] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_012234__304.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3572 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_012332__150 | 3 | 0.0 | 57.5829 | 4 | [301, 404] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_012332__150.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3573 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_223651__233 | 4 | 0.0 | 84.0566 | 5 | [301, 448] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_223651__233.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3574 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_001401__286 | 0 | 0.0 | 22.1583 | 0 | [11, 583] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_001401__286.json | 0.0 | missing | missing | missing | |
| 3575 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_013049__737 | 0 | 0.0 | 31.7041 | 0 | [469, 172] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_013049__737.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3576 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_013126__846 | 0 | 0.0 | 36.8812 | 0 | [469, 212] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_013126__846.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3577 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_224111__706 | 1 | 0.0 | 79.2699 | 4 | [469, 534] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_224111__706.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3578 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_001338__274 | 0 | 0.0 | 36.4981 | 0 | [455, 820] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_001338__274.json | 0.0 | missing | missing | missing | |
| 3579 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_012925__629 | 0 | 0.0 | 58.2451 | 0 | [466, 376] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_012925__629.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3580 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_013018__301 | 0 | 0.0 | 52.6804 | 0 | [466, 334] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_013018__301.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3581 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_223952__402 | 3 | 0.0 | 53.4697 | 4 | [466, 339] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_223952__402.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3582 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_221926__356 | 0 | 0.0 | 29.4568 | 0 | [138, 804] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_221926__356.json | 0.0 | missing | missing | missing | |
| 3583 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_221943__664 | 0 | 0.0 | 17.3309 | 0 | [1, 514] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_221943__664.json | 0.0 | missing | missing | missing | |
| 3584 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_222004__585 | 0 | 0.0 | 21.3417 | 0 | [1, 621] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_222004__585.json | 0.0 | missing | missing | missing | |
| 3585 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_014824__352 | 0 | 0.0 | 17.9921 | 0 | [162, 288] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_014824__352.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3586 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_014843__663 | 0 | 0.0 | 19.374 | 0 | [162, 312] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_014843__663.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3587 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_221841__402 | 0 | 0.0 | 22.3348 | 0 | [1, 647] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_221841__402.json | 0.0 | missing | missing | missing | |
| 3588 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_221856__778 | 0 | 0.0 | 15.4998 | 0 | [1, 463] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_221856__778.json | 50.0 | missing | missing | missing | |
| 3589 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_014735__130 | 1 | 0.0 | 24.5535 | 1 | [165, 400] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_014735__130.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3590 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_014806__329 | 3 | 0.0 | 30.5615 | 5 | [165, 501] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_014806__329.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3591 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_224921__441 | 4 | 0.0 | 17.9342 | 5 | [165, 287] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_224921__441.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3592 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_221744__173 | 0 | 0.0 | 17.0151 | 0 | [1, 499] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_221744__173.json | 0.0 | missing | missing | missing | |
| 3593 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_221800__388 | 0 | 0.0 | 15.0874 | 0 | [1, 446] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_221800__388.json | 25.0 | missing | missing | missing | |
| 3594 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_014655__745 | 4 | 0.0 | 12.078 | 5 | [206, 181] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_014655__745.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3595 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_014710__494 | 4 | 0.0 | 15.1446 | 5 | [206, 234] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_014710__494.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3596 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_224903__941 | 1 | 0.0 | 18.4789 | 1 | [206, 291] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_224903__941.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3597 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_221646__913 | 0 | 0.0 | 22.4057 | 0 | [1, 624] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_221646__913.json | 0.0 | missing | missing | missing | |
| 3598 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_221708__151 | 0 | 0.0 | 22.0768 | 0 | [1, 613] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_221708__151.json | 0.0 | missing | missing | missing | |
| 3599 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_014624__265 | 1 | 0.0 | 36.6974 | 1 | [302, 419] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_014624__265.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3600 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_014643__691 | 0 | 0.0 | 18.9424 | 0 | [302, 283] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_014643__691.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3601 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_224844__302 | 0 | 0.0 | 36.518 | 0 | [302, 433] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_224844__302.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3602 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_222224__474 | 0 | 0.0 | 27.3763 | 0 | [1, 714] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_222224__474.json | 50.0 | missing | missing | missing | |
| 3603 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_222233__220 | 0 | 0.0 | 8.84217 | 0 | [1, 248] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_222233__220.json | 0.0 | missing | missing | missing | |
| 3604 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_014957__445 | 1 | 0.0 | 26.0856 | 1 | [473, 374] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_014957__445.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3605 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015014__882 | 4 | 0.0 | 17.0582 | 5 | [473, 225] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015014__882.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3606 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_225002__870 | 4 | 0.0 | 16.9178 | 5 | [473, 223] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225002__870.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3607 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_222107__446 | 0 | 0.0 | 32.4677 | 1 | [1, 833] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_222107__446.json | 55.0 | missing | missing | missing | |
| 3608 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_222128__437 | 0 | 0.0 | 21.2514 | 1 | [1, 567] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_222128__437.json | 55.0 | missing | missing | missing | |
| 3609 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_014907__847 | 0 | 0.0 | 23.5198 | 0 | [471, 332] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_014907__847.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3610 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_014930__759 | 4 | 0.0 | 23.614 | 5 | [471, 334] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_014930__759.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3611 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_224945__339 | 0 | 0.0 | 24.5892 | 0 | [471, 350] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_224945__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3612 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231214_000753__610 | 0 | 0.0 | 19.4833 | 0 | [138, 544] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__AsIs__1SHOT__20231214_000753__610.json | 0.0 | missing | missing | missing | |
| 3613 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_011539__736 | 0 | 0.0 | 8.04745 | 0 | [157, 434] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_011539__736.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3614 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_011546__363 | 0 | 0.0 | 7.58581 | 0 | [157, 409] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_011546__363.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3615 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231214_000734__127 | 0 | 0.0 | 37.866 | 0 | [155, 1005] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_000734__127.json | 0.0 | missing | missing | missing | |
| 3616 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_011526__122 | 0 | 0.0 | 7.11257 | 0 | [160, 385] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_011526__122.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3617 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_011530__442 | 0 | 0.0 | 4.83747 | 0 | [160, 260] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_011530__442.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3618 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231226_223300__858 | 0 | 0.0 | 5.39952 | 0 | [160, 290] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_223300__858.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3619 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_000656__212 | 0 | 0.0 | 10.3617 | 0 | [184, 274] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_000656__212.json | 0.0 | missing | missing | missing | |
| 3620 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_011512__319 | 0 | 0.0 | 6.82681 | 0 | [197, 356] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_011512__319.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3621 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_011518__195 | 0 | 0.0 | 6.85805 | 0 | [197, 358] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_011518__195.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3622 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_223255__535 | 0 | 0.0 | 7.71129 | 0 | [197, 402] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_223255__535.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3623 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_000645__674 | 0 | 0.0 | 28.4978 | 0 | [280, 717] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_000645__674.json | 50.0 | missing | missing | missing | |
| 3624 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_011459__251 | 0 | 0.0 | 9.18127 | 0 | [279, 310] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011459__251.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3625 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_011505__704 | 0 | 0.0 | 5.52149 | 0 | [279, 266] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011505__704.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3626 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_223247__823 | 0 | 0.0 | 10.1047 | 0 | [279, 365] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_223247__823.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3627 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_000920__658 | 0 | 0.0 | 31.3978 | 0 | [11, 801] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_000920__658.json | 0.0 | missing | missing | missing | |
| 3628 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_011611__815 | 0 | 0.0 | 10.5591 | 0 | [447, 476] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011611__815.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3629 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_011617__442 | 0 | 0.0 | 6.67559 | 0 | [447, 283] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011617__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3630 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_223318__266 | 0 | 0.0 | 9.38555 | 0 | [447, 417] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_223318__266.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3631 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_000848__225 | 1 | 0.0 | 54.9345 | 1 | [455, 1222] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_000848__225.json | 60.0 | missing | missing | missing | |
| 3632 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_011553__245 | 0 | 0.0 | 6.95909 | 0 | [445, 297] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_011553__245.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3633 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_011600__646 | 0 | 0.0 | 6.79605 | 0 | [445, 290] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_011600__646.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3634 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_223309__796 | 0 | 0.0 | 8.71017 | 0 | [445, 384] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_223309__796.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3635 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231213_235615__665 | 0 | 0.0 | 30.9073 | 0 | [138, 839] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__AsIs__1SHOT__20231213_235615__665.json | 0.0 | missing | missing | missing | |
| 3636 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | AsIs | 1SHOT | true | true | 5 | 20231225_005052__582 | 2 | 0.0 | 12.1187 | 3 | [162, 370] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_005052__582.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3637 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | AsIs | 1SHOT | true | true | 5 | 20231225_005103__644 | 2 | 0.0 | 10.4881 | 5 | [162, 318] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_005103__644.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3638 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231213_235544__486 | 0 | 0.0 | 21.3402 | 0 | [155, 592] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__InJulia__1SHOT__20231213_235544__486.json | 50.0 | missing | missing | missing | |
| 3639 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | InJulia | 1SHOT | false | false | 5 | 20231225_005028__204 | 0 | 0.0 | 11.7793 | 0 | [165, 360] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_005028__204.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3640 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_005040__243 | 1 | 0.0 | 11.6089 | 1 | [165, 355] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_005040__243.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3641 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_222355__591 | 2 | 0.0 | 16.2779 | 3 | [165, 503] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_222355__591.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3642 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_235522__672 | 0 | 0.0 | 15.8841 | 0 | [184, 430] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231213_235522__672.json | 50.0 | missing | missing | missing | |
| 3643 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_004959__930 | 1 | 0.0 | 16.7639 | 1 | [206, 509] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_004959__930.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3644 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_005016__850 | 0 | 0.0 | 17.6194 | 0 | [206, 537] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_005016__850.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3645 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_222339__489 | 3 | 0.0 | 9.94697 | 5 | [206, 295] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_222339__489.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3646 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_235507__289 | 0 | 0.0 | 15.2682 | 0 | [280, 374] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231213_235507__289.json | 25.0 | missing | missing | missing | |
| 3647 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004929__938 | 1 | 0.0 | 21.3053 | 1 | [302, 458] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004929__938.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3648 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004942__582 | 1 | 0.0 | 12.4731 | 1 | [302, 355] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004942__582.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3649 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_222328__413 | 1 | 0.0 | 16.6966 | 1 | [302, 328] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_222328__413.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3650 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_235729__852 | 0 | 0.0 | 20.5105 | 0 | [11, 542] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231213_235729__852.json | 25.0 | missing | missing | missing | |
| 3651 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_005152__910 | 1 | 0.0 | 19.1591 | 1 | [473, 528] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_005152__910.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3652 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_005215__302 | 0 | 0.0 | 22.3886 | 0 | [473, 623] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_005215__302.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3653 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_222423__133 | 1 | 0.0 | 13.6871 | 1 | [473, 364] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_222423__133.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3654 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_235708__180 | 0 | 0.0 | 52.9346 | 0 | [455, 1179] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231213_235708__180.json | 0.0 | missing | missing | missing | |
| 3655 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_005119__447 | 0 | 0.0 | 16.2618 | 0 | [471, 441] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_005119__447.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3656 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_005133__347 | 1 | 0.0 | 13.5624 | 1 | [471, 360] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_005133__347.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3657 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_222409__343 | 3 | 0.0 | 13.7029 | 5 | [471, 364] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_222409__343.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3658 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200243__386 | 4 | 0.000397 | 1.80001 | 5 | [143, 217] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200243__386.json | 95.0 | missing | missing | missing | |
| 3659 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200245__822 | 4 | 0.00034 | 1.59593 | 5 | [143, 179] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200245__822.json | 95.0 | missing | missing | missing | |
| 3660 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200247__721 | 4 | 0.0003505 | 1.57927 | 5 | [143, 186] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200247__721.json | 95.0 | missing | missing | missing | |
| 3661 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200249__565 | 4 | 0.00043 | 1.80365 | 5 | [143, 239] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200249__565.json | 95.0 | missing | missing | missing | |
| 3662 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200250__955 | 4 | 0.000373 | 1.46162 | 5 | [143, 201] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200250__955.json | 95.0 | missing | missing | missing | |
| 3663 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200237__879 | 3 | 0.000287 | 1.26968 | 4 | [178, 132] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200237__879.json | 85.0 | missing | missing | missing | |
| 3664 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200238__284 | 2 | 0.000329 | 1.30805 | 5 | [178, 160] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200238__284.json | 85.0 | missing | missing | missing | |
| 3665 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200239__318 | 4 | 0.0002765 | 1.01241 | 5 | [178, 125] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200239__318.json | 95.0 | missing | missing | missing | |
| 3666 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200240__864 | 4 | 0.000275 | 1.0883 | 5 | [178, 124] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200240__864.json | 95.0 | missing | missing | missing | |
| 3667 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200242__130 | 4 | 0.000284 | 1.24244 | 5 | [178, 130] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200242__130.json | 95.0 | missing | missing | missing | |
| 3668 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_200230__335 | 0 | 0.000333 | 1.073 | 0 | [255, 137] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200230__335.json | 0.0 | missing | missing | missing | |
| 3669 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200231__847 | 4 | 0.000363 | 1.19729 | 5 | [255, 157] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200231__847.json | 95.0 | missing | missing | missing | |
| 3670 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_200233__266 | 0 | 0.000393 | 1.431 | 0 | [255, 177] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200233__266.json | 0.0 | missing | missing | missing | |
| 3671 | Apple-MacBook-Pro-M1 | weather_data_analyzer | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200234__279 | 4 | 0.000369 | 1.23028 | 5 | [255, 161] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/weather_data_analyzer/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200234__279.json | 95.0 | missing | missing | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |

*(Table preview truncated; remaining evaluation rows omitted for brevity.)*
| 3744 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_161924__192 | 3 | 0.0 | 4.17253 | 2 | [122, 60] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_161924__192.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3745 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_230429__479 | 3 | 0.0 | 4.3522 | 2 | [122, 64] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_230429__479.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3746 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_021248__310 | 3 | 0.0 | 15.0076 | 2 | [205, 62] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021248__310.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3747 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_021252__782 | 3 | 0.0 | 4.83215 | 2 | [205, 58] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021252__782.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3748 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_161902__548 | 0 | 0.0 | 14.4342 | 0 | [205, 56] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_161902__548.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3749 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_161916__161 | 3 | 0.0 | 13.2414 | 2 | [205, 215] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_161916__161.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3750 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_230424__576 | 3 | 0.0 | 26.7464 | 2 | [205, 293] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230424__576.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3751 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_021501__192 | 3 | 0.0 | 14.3837 | 2 | [387, 202] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021501__192.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3752 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_021524__647 | 3 | 0.0 | 22.2566 | 2 | [387, 342] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021524__647.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3753 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_162059__810 | 0 | 0.0 | 22.2164 | 2 | [387, 343] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162059__810.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3754 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162106__447 | 0 | 0.0 | 6.76183 | 0 | [387, 64] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162106__447.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3755 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_230520__509 | 0 | 0.0 | 32.5327 | 0 | [387, 522] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_230520__509.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3756 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_021415__200 | 0 | 0.0 | 26.5851 | 2 | [384, 422] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_021415__200.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3757 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_021447__720 | 3 | 0.0 | 32.0282 | 2 | [384, 516] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_021447__720.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3758 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_162016__943 | 3 | 0.0 | 16.0209 | 2 | [384, 231] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_162016__943.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3759 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_162037__630 | 3 | 0.0 | 20.8627 | 2 | [384, 322] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_162037__630.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3760 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 3 | 20231226_230448__788 | 0 | 0.0 | 5.8858 | 0 | [384, 52] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_230448__788.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3761 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234015__797 | 3 | 0.0 | 3.28407 | 2 | [0, 242] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234015__797.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3762 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234018__256 | 0 | 0.0 | 2.42173 | 2 | [0, 179] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234018__256.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3763 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234019__980 | 3 | 0.0 | 1.60534 | 2 | [0, 119] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234019__980.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3764 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234021__653 | 3 | 0.0 | 1.30903 | 2 | [0, 97] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234021__653.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3765 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234024__735 | 3 | 0.0 | 3.20956 | 2 | [0, 237] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234024__735.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3766 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_233952__616 | 0 | 0.0 | 0.919532 | 2 | [0, 68] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_233952__616.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3767 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240131_233953__466 | 0 | 0.0 | 0.824912 | 0 | [0, 61] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_233953__466.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3768 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_233954__552 | 3 | 0.0 | 0.823807 | 2 | [0, 61] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_233954__552.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3769 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_233955__128 | 3 | 0.0 | 1.25682 | 2 | [0, 93] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_233955__128.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3770 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_233956__824 | 3 | 0.0 | 0.918976 | 2 | [0, 68] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_233956__824.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3771 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_233932__708 | 3 | 0.0 | 1.34095 | 2 | [0, 97] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233932__708.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3772 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_233936__971 | 3 | 0.0 | 3.26183 | 2 | [0, 230] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233936__971.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3773 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_233937__762 | 0 | 0.0 | 0.837675 | 0 | [0, 60] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233937__762.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3774 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20240131_233941__376 | 0 | 0.0 | 4.1574 | 0 | [0, 295] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233941__376.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3775 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_233943__598 | 3 | 0.0 | 2.33555 | 2 | [0, 168] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233943__598.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3776 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234110__245 | 3 | 0.0 | 1.50847 | 2 | [0, 109] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234110__245.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3777 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234113__304 | 0 | 0.0 | 2.87064 | 2 | [0, 207] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234113__304.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3778 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20240131_234120__235 | 0 | 0.0 | 6.57552 | 0 | [0, 468] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234120__235.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3779 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234125__966 | 3 | 0.0 | 5.31517 | 2 | [0, 379] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234125__966.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3780 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234129__714 | 3 | 0.0 | 4.19325 | 2 | [0, 300] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234129__714.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3781 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234045__537 | 3 | 0.0 | 3.08539 | 2 | [0, 222] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234045__537.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3782 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20240131_234046__981 | 0 | 0.0 | 0.610192 | 0 | [0, 44] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234046__981.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3783 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234048__816 | 3 | 0.0 | 2.27692 | 2 | [0, 164] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234048__816.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3784 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234053__746 | 3 | 0.0 | 4.46606 | 2 | [0, 320] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234053__746.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3785 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234057__829 | 3 | 0.0 | 3.91783 | 2 | [0, 281] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234057__829.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3786 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | AsIs | 1SHOT | false | false | 3 | 20231214_002029__256 | 0 | 0.0 | 7.06151 | 0 | [59, 213] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__AsIs__1SHOT__20231214_002029__256.json | 0.0 | missing | missing | missing | |
| 3787 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | AsIs | 1SHOT | false | false | 3 | 20231225_021754__386 | 0 | 0.0 | 107.641 | 0 | [54, 1808] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_021754__386.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3788 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | AsIs | 1SHOT | false | false | 3 | 20231225_021802__964 | 0 | 0.0 | 7.87445 | 0 | [54, 141] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_021802__964.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3789 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | AsIs | 1SHOT | false | false | 3 | 20231225_162151__285 | 0 | 0.0 | 18.5787 | 0 | [54, 345] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_162151__285.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3790 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | AsIs | 1SHOT | true | false | 3 | 20231225_162158__985 | 0 | 0.0 | 6.94579 | 0 | [54, 124] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_162158__985.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3791 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231214_002022__833 | 0 | 0.0 | 6.40511 | 0 | [75, 188] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_002022__833.json | 0.0 | missing | missing | missing | |
| 3792 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231225_021600__530 | 0 | 0.0 | 15.3484 | 0 | [57, 283] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_021600__530.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3793 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231225_021606__957 | 0 | 0.0 | 5.89997 | 0 | [57, 103] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_021606__957.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3794 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231225_162127__954 | 0 | 0.0 | 0.749727 | 0 | [57, 4] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_162127__954.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3795 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231225_162132__483 | 0 | 0.0 | 4.86242 | 0 | [57, 84] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_162132__483.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3796 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231214_002015__298 | 0 | 0.0 | 6.45896 | 0 | [105, 179] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_002015__298.json | 0.0 | missing | missing | missing | |
| 3797 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_021544__290 | 0 | 0.0 | 7.1357 | 0 | [59, 127] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_021544__290.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3798 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_021545__827 | 0 | 0.0 | 0.76009 | 0 | [59, 4] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_021545__827.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3799 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162124__433 | 0 | 0.0 | 5.80628 | 0 | [59, 102] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_162124__433.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3800 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162126__246 | 0 | 0.0 | 2.65232 | 0 | [59, 41] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_162126__246.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3801 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231214_002008__785 | 0 | 0.0 | 15.9813 | 2 | [187, 431] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_002008__785.json | 75.0 | missing | missing | missing | |
| 3802 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_021535__590 | 0 | 0.0 | 11.5348 | 0 | [80, 15] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021535__590.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3803 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_021537__853 | 0 | 0.0 | 1.68872 | 0 | [80, 17] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021537__853.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3804 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_162117__851 | 0 | 0.0 | 10.8547 | 0 | [80, 1] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162117__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3805 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_162118__559 | 0 | 0.0 | 0.865357 | 0 | [80, 1] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162118__559.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3806 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231214_002110__479 | 0 | 0.0 | 17.9257 | 2 | [11, 489] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_002110__479.json | 75.0 | missing | missing | missing | |
| 3807 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_021812__760 | 0 | 0.0 | 7.40401 | 0 | [76, 127] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021812__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3808 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162202__710 | 0 | 0.0 | 1.53571 | 0 | [76, 14] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162202__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3809 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162207__910 | 0 | 0.0 | 4.51848 | 0 | [76, 72] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162207__910.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3810 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 3 | 20231214_002052__886 | 0 | 0.0 | 22.995 | 0 | [376, 538] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_002052__886.json | 0.0 | missing | missing | missing | |
| 3811 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_021804__883 | 0 | 0.0 | 1.6809 | 0 | [73, 17] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_021804__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3812 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_162159__226 | 0 | 0.0 | 1.68157 | 0 | [73, 17] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_162159__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3813 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_162201__157 | 0 | 0.0 | 1.51064 | 0 | [73, 14] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_162201__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3814 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 3 | 20240131_234512__444 | 0 | 0.0 | 6.33142 | 0 | [0, 229] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234512__444.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3815 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 3 | 20240131_234517__809 | 0 | 0.0 | 5.41608 | 0 | [0, 196] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234517__809.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3816 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234523__871 | 3 | 0.0 | 6.13923 | 2 | [0, 222] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234523__871.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3817 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234530__362 | 0 | 0.0 | 7.00332 | 2 | [0, 253] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234530__362.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3818 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20240131_234532__813 | 0 | 0.0 | 2.32464 | 0 | [0, 84] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234532__813.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3819 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240131_234416__474 | 0 | 0.0 | 3.13812 | 0 | [0, 113] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234416__474.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3820 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_234421__855 | 3 | 0.0 | 4.27851 | 2 | [0, 154] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234421__855.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3821 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_234425__748 | 3 | 0.0 | 4.43621 | 2 | [0, 160] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234425__748.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3822 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_234429__222 | 3 | 0.0 | 3.89434 | 2 | [0, 140] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234429__222.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3823 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_234435__509 | 3 | 0.0 | 5.55422 | 2 | [0, 200] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234435__509.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3824 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_234334__674 | 0 | 0.0 | 2.59982 | 0 | [0, 93] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234334__674.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3825 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_234339__199 | 0 | 0.0 | 5.5361 | 2 | [0, 198] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234339__199.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3826 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_234340__302 | 0 | 0.0 | 0.817776 | 0 | [0, 29] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234340__302.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3827 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_234341__499 | 0 | 0.0 | 0.776698 | 0 | [0, 28] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234341__499.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3828 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_234348__281 | 3 | 0.0 | 7.47706 | 2 | [0, 269] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234348__281.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3829 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240131_234640__880 | 0 | 0.0 | 2.56685 | 0 | [0, 91] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234640__880.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3830 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240131_234641__857 | 0 | 0.0 | 0.114819 | 0 | [0, 4] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234641__857.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3831 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234650__399 | 3 | 0.0 | 9.20754 | 2 | [0, 327] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234650__399.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3832 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234653__397 | 0 | 0.0 | 3.44825 | 2 | [0, 123] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234653__397.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3833 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234702__124 | 0 | 0.0 | 9.13644 | 2 | [0, 324] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234702__124.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3834 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234613__478 | 3 | 0.0 | 5.33466 | 2 | [0, 190] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234613__478.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3835 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234616__700 | 3 | 0.0 | 2.27038 | 2 | [0, 81] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234616__700.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3836 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234624__541 | 3 | 0.0 | 8.4402 | 2 | [0, 300] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234624__541.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3837 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20240131_234626__597 | 0 | 0.0 | 1.84805 | 0 | [0, 66] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234626__597.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3838 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234629__648 | 3 | 0.0 | 2.79703 | 2 | [0, 100] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234629__648.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3839 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 3 | 20240131_233525__217 | 0 | 0.0 | 5.61989 | 2 | [0, 138] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_233525__217.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3840 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | false | 3 | 20240131_233537__698 | 0 | 0.0 | 12.1203 | 0 | [0, 296] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_233537__698.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3841 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 3 | 20240131_233547__790 | 0 | 0.0 | 10.5065 | 0 | [0, 257] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_233547__790.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3842 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | false | 3 | 20240131_233555__562 | 0 | 0.0 | 8.0071 | 0 | [0, 196] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_233555__562.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3843 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 3 | 20240131_233613__618 | 0 | 0.0 | 17.2102 | 0 | [0, 421] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240131_233613__618.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3844 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240131_233406__962 | 0 | 0.0 | 10.5043 | 0 | [0, 258] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_233406__962.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3845 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240131_233414__379 | 0 | 0.0 | 7.91825 | 0 | [0, 194] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_233414__379.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3846 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240131_233422__777 | 0 | 0.0 | 8.05288 | 0 | [0, 198] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_233422__777.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3847 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240131_233428__663 | 0 | 0.0 | 5.94218 | 0 | [0, 146] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_233428__663.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3848 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_233436__188 | 3 | 0.0 | 8.32802 | 2 | [0, 205] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_233436__188.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3849 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20240131_233223__113 | 0 | 0.0 | 5.32917 | 0 | [0, 129] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233223__113.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3850 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_233230__925 | 0 | 0.0 | 7.10838 | 0 | [0, 172] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233230__925.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3851 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_233238__134 | 0 | 0.0 | 8.15789 | 0 | [0, 197] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233238__134.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3852 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_233248__796 | 0 | 0.0 | 9.74968 | 0 | [0, 236] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233248__796.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3853 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_233301__852 | 3 | 0.0 | 13.0476 | 2 | [0, 315] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_233301__852.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3854 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240131_233836__812 | 0 | 0.0 | 9.64603 | 0 | [0, 233] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233836__812.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3855 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240131_233846__194 | 0 | 0.0 | 10.2016 | 0 | [0, 242] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233846__194.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3856 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_233856__673 | 0 | 0.0 | 9.88198 | 2 | [0, 236] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233856__673.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3857 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_233905__934 | 3 | 0.0 | 9.43377 | 2 | [0, 226] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233905__934.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3858 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240131_233914__824 | 0 | 0.0 | 8.76873 | 0 | [0, 210] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233914__824.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3859 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20240131_233719__118 | 0 | 0.0 | 6.90371 | 0 | [0, 168] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_233719__118.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3860 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_233729__326 | 3 | 0.0 | 10.2252 | 2 | [0, 248] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_233729__326.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3861 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20240131_233739__844 | 0 | 0.0 | 9.27016 | 0 | [0, 225] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_233739__844.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3862 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20240131_233745__588 | 0 | 0.0 | 5.95481 | 0 | [0, 145] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_233745__588.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3863 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20240131_233753__872 | 0 | 0.0 | 8.63407 | 0 | [0, 209] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240131_233753__872.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3864 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 3 | 20240131_232537__330 | 0 | 0.0 | 14.165 | 0 | [0, 265] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_232537__330.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3865 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_232601__898 | 3 | 0.0 | 23.4141 | 2 | [0, 437] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_232601__898.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3866 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_232613__460 | 3 | 0.0 | 12.0864 | 2 | [0, 226] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_232613__460.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3867 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_232626__218 | 3 | 0.0 | 13.3558 | 2 | [0, 250] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_232626__218.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3868 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_232641__470 | 3 | 0.0 | 14.8492 | 2 | [0, 278] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_232641__470.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3869 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 3 | 20240131_232335__756 | 0 | 0.0 | 12.4563 | 0 | [0, 231] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_232335__756.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3870 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 3 | 20240131_232345__161 | 0 | 0.0 | 9.08156 | 0 | [0, 170] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_232345__161.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3871 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240131_232356__170 | 0 | 0.0 | 11.0734 | 0 | [0, 206] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_232356__170.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3872 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_232405__100 | 0 | 0.0 | 9.66621 | 2 | [0, 181] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_232405__100.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3873 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240131_232414__651 | 0 | 0.0 | 8.27732 | 0 | [0, 155] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_232414__651.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3874 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_232121__999 | 0 | 0.0 | 7.08541 | 2 | [0, 132] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_232121__999.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3875 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_232130__396 | 0 | 0.0 | 9.34855 | 0 | [0, 174] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_232130__396.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3876 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_232139__999 | 0 | 0.0 | 9.29265 | 0 | [0, 173] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_232139__999.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3877 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_232151__209 | 0 | 0.0 | 11.5033 | 0 | [0, 214] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_232151__209.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3878 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_232207__736 | 3 | 0.0 | 16.1209 | 2 | [0, 298] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_232207__736.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3879 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_233003__609 | 0 | 0.0 | 10.7841 | 2 | [0, 199] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233003__609.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3880 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240131_233016__741 | 0 | 0.0 | 12.911 | 0 | [0, 239] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233016__741.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3881 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_233034__381 | 3 | 0.0 | 18.5806 | 2 | [0, 343] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233034__381.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3882 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240131_233047__368 | 0 | 0.0 | 12.6558 | 0 | [0, 234] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233047__368.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3883 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240131_233053__691 | 0 | 0.0 | 6.07285 | 0 | [0, 113] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_233053__691.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3884 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_232807__534 | 3 | 0.0 | 22.3309 | 2 | [0, 412] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_232807__534.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3885 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20240131_232819__129 | 0 | 0.0 | 11.8956 | 0 | [0, 220] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_232819__129.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3886 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_232836__797 | 3 | 0.0 | 17.5869 | 2 | [0, 324] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_232836__797.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3887 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_232845__201 | 0 | 0.0 | 8.83035 | 2 | [0, 162] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_232845__201.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3888 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20240131_232854__112 | 0 | 0.0 | 8.88191 | 0 | [0, 163] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_232854__112.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3889 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234203__237 | 3 | 0.0 | 0.581542 | 2 | [0, 66] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234203__237.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3890 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234205__504 | 0 | 0.0 | 1.8128 | 2 | [0, 209] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234205__504.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3891 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234207__577 | 3 | 0.0 | 2.17708 | 2 | [0, 252] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234207__577.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3892 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234208__187 | 3 | 0.0 | 0.578096 | 2 | [0, 67] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234208__187.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3893 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20240131_234208__392 | 3 | 0.0 | 0.570319 | 2 | [0, 66] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_234208__392.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3894 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_234155__565 | 3 | 0.0 | 0.521221 | 2 | [0, 60] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234155__565.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3895 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 3 | 20240131_234156__498 | 0 | 0.0 | 1.77534 | 0 | [0, 205] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234156__498.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3896 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_234157__967 | 0 | 0.0 | 0.582017 | 2 | [0, 67] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234157__967.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3897 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_234158__691 | 3 | 0.0 | 0.590742 | 2 | [0, 66] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234158__691.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3898 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240131_234200__397 | 3 | 0.0 | 1.90894 | 2 | [0, 212] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_234200__397.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3899 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_234141__994 | 0 | 0.0 | 2.05046 | 0 | [0, 240] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234141__994.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3900 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_234142__374 | 0 | 0.0 | 0.493859 | 2 | [0, 58] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234142__374.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3901 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_234144__921 | 0 | 0.0 | 1.81176 | 0 | [0, 213] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234144__921.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3902 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240131_234145__608 | 3 | 0.0 | 0.81566 | 2 | [0, 96] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234145__608.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3903 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240131_234146__923 | 0 | 0.0 | 1.00317 | 0 | [0, 117] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234146__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3904 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234247__709 | 2 | 0.0 | 4.2151 | 2 | [0, 482] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234247__709.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 3905 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234250__267 | 3 | 0.0 | 2.11669 | 2 | [0, 247] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234250__267.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3906 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234252__773 | 0 | 0.0 | 1.64169 | 2 | [0, 192] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234252__773.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3907 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20240131_234254__749 | 0 | 0.0 | 2.64053 | 0 | [0, 307] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234254__749.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3908 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240131_234257__701 | 0 | 0.0 | 2.41383 | 2 | [0, 273] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_234257__701.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3909 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234221__154 | 3 | 0.0 | 0.613843 | 2 | [0, 72] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234221__154.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3910 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234221__426 | 3 | 0.0 | 0.748925 | 2 | [0, 87] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234221__426.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3911 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234222__253 | 3 | 0.0 | 0.741016 | 2 | [0, 87] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234222__253.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3912 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20240131_234223__447 | 3 | 0.0 | 0.572065 | 2 | [0, 67] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234223__447.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3913 | NVIDIA-RTX-4090-4x | FloatWithUnits | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20240131_234226__201 | 0 | 0.0 | 3.26233 | 0 | [0, 373] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_234226__201.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3914 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231219_224341__227 | 0 | 0.0 | 8.57056 | 0 | [1, 272] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_224341__227.json | 0.0 | missing | missing | missing | |
| 3915 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 3 | 20231225_023410__870 | 3 | 0.0 | 48.4545 | 2 | [71, 291] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_023410__870.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3916 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 3 | 20231225_023458__641 | 3 | 0.0 | 48.2409 | 2 | [71, 290] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_023458__641.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3917 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 3 | 20231225_163805__215 | 3 | 0.0 | 36.283 | 2 | [71, 216] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_163805__215.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3918 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 3 | 20231225_163841__366 | 3 | 0.0 | 35.6127 | 2 | [71, 212] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_163841__366.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3919 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_023255__420 | 3 | 0.0 | 28.8166 | 2 | [74, 168] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_023255__420.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3920 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_023321__429 | 3 | 0.0 | 25.8871 | 2 | [74, 150] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_023321__429.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3921 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_163646__277 | 3 | 0.0 | 43.1555 | 2 | [74, 259] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_163646__277.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3922 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_163728__515 | 3 | 0.0 | 41.9905 | 2 | [74, 252] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_163728__515.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3923 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231226_231325__771 | 3 | 0.0 | 32.8815 | 2 | [74, 194] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_231325__771.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3924 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_023131__886 | 3 | 0.0 | 42.651 | 2 | [115, 248] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_023131__886.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3925 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_023226__419 | 0 | 0.0 | 53.6538 | 2 | [115, 316] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_023226__419.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3926 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_163531__762 | 3 | 0.0 | 31.3656 | 2 | [115, 179] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_163531__762.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3927 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_163602__804 | 3 | 0.0 | 30.8255 | 2 | [115, 176] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_163602__804.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3928 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_231252__164 | 3 | 0.0 | 42.0434 | 2 | [115, 246] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_231252__164.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3929 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_023029__835 | 3 | 0.0 | 69.078 | 2 | [197, 222] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_023029__835.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3930 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_023048__378 | 3 | 0.0 | 19.1021 | 2 | [197, 87] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_023048__378.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3931 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_163430__212 | 0 | 0.0 | 48.2711 | 2 | [197, 105] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_163430__212.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3932 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_163459__345 | 3 | 0.0 | 28.939 | 2 | [197, 149] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_163459__345.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3933 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_231210__873 | 0 | 0.0 | 75.213 | 2 | [197, 287] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231210__873.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3934 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_023705__524 | 0 | 0.0 | 37.5656 | 2 | [403, 169] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_023705__524.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3935 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_023735__219 | 3 | 0.0 | 29.8091 | 2 | [403, 122] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_023735__219.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3936 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_164114__200 | 3 | 0.0 | 40.8368 | 2 | [403, 190] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164114__200.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3937 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_164135__404 | 3 | 0.0 | 20.9223 | 2 | [403, 68] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164135__404.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3938 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231226_231439__303 | 3 | 0.0 | 29.5377 | 2 | [403, 121] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231439__303.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3939 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_023550__704 | 3 | 0.0 | 51.1249 | 2 | [401, 250] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_023550__704.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3940 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_023627__211 | 0 | 0.0 | 35.4669 | 2 | [401, 156] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_023627__211.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3941 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_163924__161 | 0 | 0.0 | 43.12 | 2 | [401, 204] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_163924__161.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3942 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164032__143 | 3 | 0.0 | 68.5556 | 2 | [401, 357] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164032__143.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3943 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_231409__520 | 3 | 0.0 | 43.7922 | 2 | [401, 208] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_231409__520.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3944 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 3 | 20231226_232017__632 | 0 | 0.0 | 7.32564 | 0 | [76, 282] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_232017__632.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3945 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 3 | 20231227_110234__489 | 0 | 0.0 | 7.34088 | 2 | [76, 282] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_110234__489.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3946 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 3 | 20231227_110241__217 | 0 | 0.0 | 6.55351 | 0 | [76, 251] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_110241__217.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3947 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 3 | 20231227_110248__925 | 0 | 0.0 | 6.68985 | 0 | [76, 256] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_110248__925.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3948 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_232010__600 | 0 | 0.0 | 1.27291 | 0 | [113, 38] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_232010__600.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3949 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231227_110215__991 | 0 | 0.0 | 1.83226 | 0 | [113, 61] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_110215__991.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3950 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231227_110220__342 | 0 | 0.0 | 5.389 | 0 | [113, 201] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_110220__342.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3951 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231227_110227__121 | 0 | 0.0 | 6.30594 | 0 | [113, 237] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_110227__121.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3952 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231226_232008__226 | 0 | 0.0 | 8.67766 | 0 | [193, 184] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232008__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3953 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231227_110207__574 | 0 | 0.0 | 7.82199 | 0 | [193, 146] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110207__574.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3954 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_110210__485 | 0 | 0.0 | 2.81742 | 0 | [193, 87] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110210__485.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3955 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_110213__819 | 0 | 0.0 | 3.20243 | 0 | [193, 102] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110213__819.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3956 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_232035__963 | 0 | 0.0 | 9.02436 | 0 | [365, 293] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232035__963.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3957 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_110319__288 | 0 | 0.0 | 6.02211 | 0 | [365, 183] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110319__288.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3958 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231227_110327__589 | 0 | 0.0 | 8.33132 | 0 | [365, 268] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110327__589.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3959 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231227_110335__365 | 0 | 0.0 | 8.10514 | 0 | [365, 260] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110335__365.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3960 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_232026__386 | 0 | 0.0 | 9.38119 | 0 | [362, 306] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_232026__386.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3961 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231227_110300__374 | 0 | 0.0 | 12.2485 | 0 | [362, 407] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_110300__374.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3962 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_110307__642 | 0 | 0.0 | 6.86567 | 0 | [362, 214] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_110307__642.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3963 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231227_110313__981 | 0 | 0.0 | 5.93543 | 0 | [362, 180] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_110313__981.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3964 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 3 | 20240217_105623__490 | 0 | 0.0 | 4.03007 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105623__490.json | 75.0 | missing | missing | missing | |
| 3965 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 3 | 20240217_105626__211 | 0 | 0.0 | 2.67184 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105626__211.json | 25.0 | missing | missing | missing | |
| 3966 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 3 | 20240217_105627__680 | 3 | 0.0 | 1.93754 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105627__680.json | 100.0 | missing | missing | missing | |
| 3967 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 3 | 20240217_105630__416 | 3 | 0.0 | 2.38807 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105630__416.json | 100.0 | missing | missing | missing | |
| 3968 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 3 | 20240217_105632__702 | 3 | 0.0 | 2.10675 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105632__702.json | 100.0 | missing | missing | missing | |
| 3969 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240217_105544__340 | 3 | 0.0 | 2.51057 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105544__340.json | 100.0 | missing | missing | missing | |
| 3970 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240217_105546__318 | 0 | 0.0 | 2.07874 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105546__318.json | 0.0 | missing | missing | missing | |
| 3971 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 3 | 20240217_105554__597 | 0 | 0.0 | 7.48137 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105554__597.json | 25.0 | missing | missing | missing | |
| 3972 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240217_105558__960 | 3 | 0.0 | 4.30275 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105558__960.json | 100.0 | missing | missing | missing | |
| 3973 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 3 | 20240217_105600__689 | 0 | 0.0 | 2.15755 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105600__689.json | 25.0 | missing | missing | missing | |
| 3974 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240217_105513__884 | 0 | 0.0 | 2.78183 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105513__884.json | 50.0 | missing | missing | missing | |
| 3975 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240217_105515__414 | 3 | 0.0 | 2.25199 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105515__414.json | 100.0 | missing | missing | missing | |
| 3976 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240217_105517__997 | 3 | 0.0 | 2.6333 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105517__997.json | 100.0 | missing | missing | missing | |
| 3977 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240217_105520__855 | 0 | 0.0 | 2.01771 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105520__855.json | 50.0 | missing | missing | missing | |
| 3978 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20240217_113821__202 | 0 | 0.0 | 8.08681 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_113821__202.json | 25.0 | missing | missing | missing | |
| 3979 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240217_105728__972 | 3 | 0.0 | 2.47889 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105728__972.json | 100.0 | missing | missing | missing | |
| 3980 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240217_105731__554 | 0 | 0.0 | 2.61164 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105731__554.json | 75.0 | missing | missing | missing | |
| 3981 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240217_105736__833 | 0 | 0.0 | 5.39369 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105736__833.json | 75.0 | missing | missing | missing | |
| 3982 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20240217_105745__210 | 0 | 0.0 | 8.04275 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105745__210.json | 25.0 | missing | missing | missing | |
| 3983 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240217_105747__255 | 0 | 0.0 | 2.31148 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_105747__255.json | 0.0 | missing | missing | missing | |
| 3984 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 3 | 20240217_105653__827 | 0 | 0.0 | 2.57406 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105653__827.json | 50.0 | missing | missing | missing | |
| 3985 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 3 | 20240217_105656__820 | 0 | 0.0 | 3.36108 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105656__820.json | 25.0 | missing | missing | missing | |
| 3986 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 3 | 20240217_105704__115 | 0 | 0.0 | 7.71004 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105704__115.json | 25.0 | missing | missing | missing | |
| 3987 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 3 | 20240217_105707__311 | 0 | 0.0 | 3.36778 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105707__311.json | 75.0 | missing | missing | missing | |
| 3988 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 3 | 20240217_105710__561 | 0 | 0.0 | 3.16932 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105710__561.json | 25.0 | missing | missing | missing | |
| 3989 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 3 | 20240223_222713__512 | 0 | 0.0 | 13.183 | 0 | [0, 202] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_222713__512.json | 0.0 | missing | missing | missing | |
| 3990 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 3 | 20240223_222730__526 | 0 | 0.0 | 16.8549 | 0 | [0, 258] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_222730__526.json | 0.0 | missing | missing | missing | |
| 3991 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 3 | 20240223_222748__607 | 0 | 0.0 | 17.418 | 0 | [0, 268] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_222748__607.json | 0.0 | missing | missing | missing | |
| 3992 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 3 | 20240223_222808__613 | 0 | 0.0 | 20.0038 | 0 | [0, 310] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_222808__613.json | 0.0 | missing | missing | missing | |
| 3993 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 3 | 20240223_222815__510 | 0 | 0.0 | 6.99828 | 0 | [0, 106] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_222815__510.json | 25.0 | missing | missing | missing | |
| 3994 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240223_222529__981 | 0 | 0.0 | 3.39513 | 0 | [0, 48] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_222529__981.json | 0.0 | missing | missing | missing | |
| 3995 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240223_222532__527 | 0 | 0.0 | 3.20281 | 0 | [0, 48] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_222532__527.json | 0.0 | missing | missing | missing | |
| 3996 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240223_222536__358 | 0 | 0.0 | 3.30698 | 0 | [0, 50] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_222536__358.json | 0.0 | missing | missing | missing | |
| 3997 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240223_222539__979 | 0 | 0.0 | 3.19056 | 0 | [0, 50] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_222539__979.json | 0.0 | missing | missing | missing | |
| 3998 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240223_222542__351 | 0 | 0.0 | 3.63655 | 0 | [0, 53] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_222542__351.json | 0.0 | missing | missing | missing | |
| 3999 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240223_222353__477 | 0 | 0.0 | 16.1787 | 0 | [0, 247] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_222353__477.json | 0.0 | missing | missing | missing | |
| 4000 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240223_222407__336 | 0 | 0.0 | 13.4391 | 0 | [0, 203] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_222407__336.json | 0.0 | missing | missing | missing | |
| 4001 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240223_222425__403 | 0 | 0.0 | 18.2085 | 0 | [0, 278] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_222425__403.json | 0.0 | missing | missing | missing | |
| 4002 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240223_222448__108 | 0 | 0.0 | 22.9597 | 0 | [0, 355] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_222448__108.json | 0.0 | missing | missing | missing | |
| 4003 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240223_222509__574 | 0 | 0.0 | 21.0184 | 0 | [0, 320] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_222509__574.json | 0.0 | missing | missing | missing | |
| 4004 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240223_223314__640 | 0 | 0.0 | 20.3623 | 0 | [0, 310] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_223314__640.json | 0.0 | missing | missing | missing | |
| 4005 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240223_223335__927 | 0 | 0.0 | 20.5798 | 0 | [0, 313] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_223335__927.json | 0.0 | missing | missing | missing | |
| 4006 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240223_223359__374 | 0 | 0.0 | 23.6784 | 0 | [0, 356] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_223359__374.json | 0.0 | missing | missing | missing | |
| 4007 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240223_223418__950 | 0 | 0.0 | 19.3459 | 0 | [0, 295] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_223418__950.json | 0.0 | missing | missing | missing | |
| 4008 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240223_223438__526 | 0 | 0.0 | 19.8343 | 0 | [0, 298] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_223438__526.json | 0.0 | missing | missing | missing | |
| 4009 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20240223_223005__576 | 0 | 0.0 | 23.8501 | 0 | [0, 360] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_223005__576.json | 0.0 | missing | missing | missing | |
| 4010 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20240223_223020__683 | 0 | 0.0 | 14.9333 | 0 | [0, 225] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_223020__683.json | 0.0 | missing | missing | missing | |
| 4011 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20240223_223037__284 | 0 | 0.0 | 17.6153 | 0 | [0, 265] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_223037__284.json | 0.0 | missing | missing | missing | |
| 4012 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20240223_223056__251 | 0 | 0.0 | 18.9843 | 0 | [0, 289] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_223056__251.json | 0.0 | missing | missing | missing | |
| 4013 | Apple-MacBook-Pro-M1 | FloatWithUnits | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20240223_223113__335 | 0 | 0.0 | 16.5077 | 0 | [0, 250] | 0.13.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_223113__335.json | 0.0 | missing | missing | missing | |
| 4014 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 3 | 20231213_201351__337 | 0 | 0.000308 | 4.35373 | 0 | [67, 183] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_201351__337.json | 0.0 | missing | missing | missing | |
| 4015 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 3 | 20231225_190047__860 | 0 | 0.00044 | 4.91049 | 0 | [67, 271] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_190047__860.json | 0.0 | missing | missing | missing | |
| 4016 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 3 | 20231225_190050__697 | 0 | 0.000251 | 2.74617 | 0 | [67, 145] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_190050__697.json | 0.0 | missing | missing | missing | |
| 4017 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo--optim | AsIs | 1SHOT | false | false | 3 | 20231215_192409__756 | 0 | 0.0 | 4.99557 | 0 | [67, 219] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_192409__756.json | 0.0 | 0.5 | missing | 0.5 |
| 4018 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 3 | 20231213_201347__299 | 3 | 0.0002765 | 4.15019 | 2 | [70, 161] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_201347__299.json | 100.0 | missing | missing | missing | |
| 4019 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 3 | 20231225_190039__866 | 3 | 0.0003545 | 4.08241 | 2 | [70, 213] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_190039__866.json | 100.0 | missing | missing | missing | |
| 4020 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 3 | 20231225_190042__872 | 3 | 0.0002555 | 2.8269 | 2 | [70, 147] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_190042__872.json | 100.0 | missing | missing | missing | |
| 4021 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 3 | 20231227_193534__250 | 3 | 0.0004985 | 5.65665 | 2 | [70, 309] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_193534__250.json | 100.0 | missing | missing | missing | |
| 4022 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 3 | 20231227_193538__562 | 3 | 0.0003005 | 3.63871 | 2 | [70, 177] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_193538__562.json | 100.0 | missing | missing | missing | |
| 4023 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 3 | 20231215_192404__558 | 3 | 0.0 | 5.43684 | 2 | [70, 213] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_192404__558.json | 100.0 | 0.5 | missing | 0.5 |
| 4024 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231213_201342__259 | 3 | 0.0003585 | 4.59126 | 2 | [105, 204] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_201342__259.json | 100.0 | missing | missing | missing | |
| 4025 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_190030__186 | 3 | 0.000216 | 1.93324 | 2 | [105, 109] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_190030__186.json | 100.0 | missing | missing | missing | |
| 4026 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_190035__236 | 3 | 0.0005025 | 5.28558 | 2 | [105, 300] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_190035__236.json | 100.0 | missing | missing | missing | |
| 4027 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_193524__736 | 3 | 0.000264 | 2.7605 | 2 | [105, 141] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_193524__736.json | 100.0 | missing | missing | missing | |
| 4028 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_193529__111 | 3 | 0.000417 | 4.62974 | 2 | [105, 243] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_193529__111.json | 100.0 | missing | missing | missing | |
| 4029 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231215_192359__279 | 3 | 0.0 | 3.79989 | 2 | [105, 147] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_192359__279.json | 100.0 | 0.5 | missing | 0.5 |
| 4030 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231213_201338__499 | 3 | 0.000213 | 2.42776 | 2 | [174, 84] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_201338__499.json | 100.0 | missing | missing | missing | |
| 4031 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_190026__538 | 3 | 0.000306 | 2.79707 | 2 | [174, 146] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190026__538.json | 100.0 | missing | missing | missing | |
| 4032 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_190027__131 | 3 | 0.000216 | 1.84858 | 2 | [174, 86] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190027__131.json | 100.0 | missing | missing | missing | |
| 4033 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_193519__958 | 3 | 0.000207 | 1.92374 | 2 | [174, 80] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193519__958.json | 100.0 | missing | missing | missing | |
| 4034 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_193521__152 | 3 | 0.0002115 | 1.84881 | 2 | [174, 83] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193521__152.json | 100.0 | missing | missing | missing | |
| 4035 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231215_192355__584 | 3 | 0.0 | 3.07838 | 2 | [174, 98] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_192355__584.json | 100.0 | 0.5 | missing | 0.5 |
| 4036 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231213_201354__107 | 0 | 0.0002105 | 1.19608 | 0 | [328, 31] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_201354__107.json | 0.0 | missing | missing | missing | |
| 4037 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_190054__517 | 0 | 0.0002435 | 1.12187 | 0 | [328, 53] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190054__517.json | 0.0 | missing | missing | missing | |
| 4038 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_190056__389 | 0 | 0.00032 | 1.82303 | 0 | [328, 104] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190056__389.json | 0.0 | missing | missing | missing | |
| 4039 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_193547__269 | 3 | 0.0005165 | 4.47638 | 2 | [328, 235] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193547__269.json | 100.0 | missing | missing | missing | |
| 4040 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_193552__280 | 3 | 0.0005 | 4.17307 | 2 | [328, 224] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193552__280.json | 100.0 | missing | missing | missing | |
| 4041 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231215_192420__553 | 3 | 0.0 | 6.02303 | 2 | [328, 254] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_192420__553.json | 100.0 | 0.5 | missing | 0.5 |
| 4042 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 3 | 20231213_201353__650 | 0 | 0.0002445 | 1.71082 | 0 | [327, 54] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_201353__650.json | 0.0 | missing | missing | missing | |
| 4043 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_190051__850 | 0 | 0.0002535 | 1.54476 | 0 | [327, 60] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_190051__850.json | 0.0 | missing | missing | missing | |
| 4044 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_190053__715 | 0 | 0.000249 | 1.22739 | 0 | [327, 57] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_190053__715.json | 0.0 | missing | missing | missing | |
| 4045 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_193541__757 | 3 | 0.0003615 | 2.72118 | 2 | [327, 132] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_193541__757.json | 100.0 | missing | missing | missing | |
| 4046 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_193543__872 | 3 | 0.000303 | 1.84476 | 2 | [327, 93] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_193543__872.json | 100.0 | missing | missing | missing | |
| 4047 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | true | true | 3 | 20231215_192414__576 | 3 | 0.0 | 4.42405 | 2 | [327, 154] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_192414__576.json | 100.0 | 0.5 | missing | 0.5 |
| 4048 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 3 | 20240201_200315__709 | 3 | 0.000236 | 1.04464 | 2 | [70, 134] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200315__709.json | 100.0 | missing | missing | missing | |
| 4049 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 3 | 20240201_200316__205 | 3 | 0.0002855 | 1.36013 | 2 | [70, 167] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200316__205.json | 100.0 | missing | missing | missing | |
| 4050 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 3 | 20240201_200317__251 | 3 | 0.0002465 | 1.31435 | 2 | [70, 141] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200317__251.json | 100.0 | missing | missing | missing | |
| 4051 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 3 | 20240201_200319__863 | 3 | 0.0002975 | 1.30782 | 2 | [70, 175] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200319__863.json | 100.0 | missing | missing | missing | |
| 4052 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 3 | 20240201_200320__598 | 0 | 0.000263 | 1.23663 | 2 | [70, 152] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200320__598.json | 75.0 | missing | missing | missing | |
| 4053 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240201_200311__448 | 3 | 0.000117 | 0.561062 | 2 | [105, 43] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200311__448.json | 100.0 | missing | missing | missing | |
| 4054 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240201_200312__695 | 3 | 0.0001155 | 0.597908 | 2 | [105, 42] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200312__695.json | 100.0 | missing | missing | missing | |
| 4055 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240201_200312__738 | 3 | 0.0001155 | 0.743367 | 2 | [105, 42] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200312__738.json | 100.0 | missing | missing | missing | |
| 4056 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240201_200313__610 | 3 | 0.00012 | 0.559738 | 2 | [105, 45] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200313__610.json | 100.0 | missing | missing | missing | |
| 4057 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240201_200314__906 | 3 | 0.000123 | 0.576569 | 2 | [105, 47] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200314__906.json | 100.0 | missing | missing | missing | |
| 4058 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240201_200307__988 | 3 | 0.0002355 | 1.06838 | 2 | [174, 99] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200307__988.json | 100.0 | missing | missing | missing | |
| 4059 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240201_200308__771 | 3 | 0.000219 | 1.03501 | 2 | [174, 88] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200308__771.json | 100.0 | missing | missing | missing | |
| 4060 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240201_200309__310 | 3 | 0.000243 | 0.874583 | 2 | [174, 104] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200309__310.json | 100.0 | missing | missing | missing | |
| 4061 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240201_200310__286 | 3 | 0.0002235 | 0.897525 | 2 | [174, 91] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200310__286.json | 100.0 | missing | missing | missing | |
| 4062 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240201_200310__757 | 3 | 0.000213 | 0.81395 | 2 | [174, 84] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200310__757.json | 100.0 | missing | missing | missing | |
| 4063 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240201_200327__593 | 3 | 0.000272 | 1.00198 | 2 | [328, 72] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200327__593.json | 100.0 | missing | missing | missing | |
| 4064 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240201_200328__119 | 0 | 0.000257 | 0.830602 | 0 | [328, 62] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200328__119.json | 0.0 | missing | missing | missing | |
| 4065 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240201_200329__565 | 0 | 0.000299 | 0.858325 | 2 | [328, 90] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200329__565.json | 75.0 | missing | missing | missing | |
| 4066 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240201_200330__416 | 3 | 0.0002975 | 0.87069 | 2 | [328, 89] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200330__416.json | 100.0 | missing | missing | missing | |
| 4067 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240201_200331__940 | 3 | 0.00026 | 0.850238 | 2 | [328, 64] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200331__940.json | 100.0 | missing | missing | missing | |
| 4068 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 3 | 20240201_200321__244 | 3 | 0.00027 | 0.745148 | 2 | [327, 71] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200321__244.json | 100.0 | missing | missing | missing | |
| 4069 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 3 | 20240201_200322__488 | 3 | 0.000276 | 0.93191 | 2 | [327, 75] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200322__488.json | 100.0 | missing | missing | missing | |
| 4070 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 3 | 20240201_200324__552 | 3 | 0.000276 | 2.55785 | 2 | [327, 75] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200324__552.json | 100.0 | missing | missing | missing | |
| 4071 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 3 | 20240201_200325__465 | 3 | 0.000315 | 0.965759 | 2 | [327, 101] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200325__465.json | 100.0 | missing | missing | missing | |
| 4072 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 3 | 20240201_200326__777 | 3 | 0.0003555 | 1.24121 | 2 | [327, 128] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200326__777.json | 100.0 | missing | missing | missing | |
| 4073 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 3 | 20231213_201407__598 | 0 | 0.000409 | 3.7833 | 0 | [67, 171] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_201407__598.json | 0.0 | missing | missing | missing | |
| 4074 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 3 | 20231225_190107__434 | 0 | 0.000323 | 2.10274 | 0 | [67, 128] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_190107__434.json | 0.0 | missing | missing | missing | |
| 4075 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 3 | 20231225_190109__214 | 0 | 0.000281 | 1.73271 | 0 | [67, 107] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_190109__214.json | 0.0 | missing | missing | missing | |
| 4076 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 3 | 20231215_192433__155 | 0 | 0.0 | 6.34535 | 0 | [67, 183] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_192433__155.json | 0.0 | 0.9 | missing | 0.1 | |
| 4077 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 3 | 20231213_201403__725 | 3 | 0.000358 | 2.80146 | 2 | [70, 144] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_201403__725.json | 100.0 | missing | missing | missing | |
| 4078 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 3 | 20231225_190103__220 | 3 | 0.0003 | 1.61424 | 2 | [70, 115] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_190103__220.json | 100.0 | missing | missing | missing | |
| 4079 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 3 | 20231225_190105__542 | 3 | 0.000346 | 1.91892 | 2 | [70, 138] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_190105__542.json | 100.0 | missing | missing | missing | |
| 4080 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 3 | 20231227_193602__307 | 0 | 0.000344 | 2.96972 | 2 | [70, 137] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_193602__307.json | 75.0 | missing | missing | missing | |
| 4081 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 3 | 20231227_193607__128 | 3 | 0.000386 | 4.55263 | 2 | [70, 158] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_193607__128.json | 100.0 | missing | missing | missing | |
| 4082 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 3 | 20231215_192426__601 | 3 | 0.0 | 2.12691 | 2 | [70, 103] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_192426__601.json | 100.0 | 0.9 | missing | 0.1 | |
| 4083 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231213_201400__662 | 3 | 0.000199 | 2.22722 | 2 | [105, 47] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_201400__662.json | 100.0 | missing | missing | missing | |
| 4084 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_190100__237 | 0 | 0.000199 | 1.0175 | 0 | [105, 47] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_190100__237.json | 0.0 | missing | missing | missing | |
| 4085 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_190101__280 | 3 | 0.000189 | 1.02258 | 2 | [105, 42] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_190101__280.json | 100.0 | missing | missing | missing | |
| 4086 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_193558__683 | 3 | 0.000199 | 2.17218 | 2 | [105, 47] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_193558__683.json | 100.0 | missing | missing | missing | |
| 4087 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_193559__715 | 3 | 0.000199 | 1.25106 | 2 | [105, 47] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_193559__715.json | 100.0 | missing | missing | missing | |
| 4088 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231215_192424__171 | 3 | 0.0 | 1.31695 | 2 | [105, 42] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_192424__171.json | 100.0 | 0.9 | missing | 0.1 | |
| 4089 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231213_201358__616 | 3 | 0.000364 | 3.23484 | 2 | [174, 95] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_201358__616.json | 100.0 | missing | missing | missing | |
| 4090 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_190057__433 | 3 | 0.000376 | 1.63006 | 2 | [174, 101] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190057__433.json | 100.0 | missing | missing | missing | |
| 4091 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_190059__222 | 3 | 0.000356 | 1.57123 | 2 | [174, 91] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190059__222.json | 100.0 | missing | missing | missing | |
| 4092 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_193554__383 | 3 | 0.000352 | 2.04377 | 2 | [174, 89] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193554__383.json | 100.0 | missing | missing | missing | |
| 4093 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_193556__136 | 3 | 0.000374 | 1.85966 | 2 | [174, 100] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193556__136.json | 100.0 | missing | missing | missing | |
| 4094 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231215_192423__791 | 3 | 0.0 | 2.56138 | 2 | [174, 84] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_192423__791.json | 100.0 | 0.9 | missing | 0.1 | |
| 4095 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231213_201411__634 | 0 | 0.000582 | 2.80682 | 0 | [328, 127] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_201411__634.json | 0.0 | missing | missing | missing | |
| 4096 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_190112__102 | 0 | 0.000464 | 1.16083 | 0 | [328, 68] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190112__102.json | 0.0 | missing | missing | missing | |
| 4097 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_190114__167 | 0 | 0.00056 | 1.49353 | 2 | [328, 116] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190114__167.json | 75.0 | missing | missing | missing | |
| 4098 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231227_193612__790 | 0 | 0.000574 | 2.2095 | 0 | [328, 123] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193612__790.json | 0.0 | missing | missing | missing | |
| 4099 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_193614__803 | 0 | 0.000566 | 2.06064 | 2 | [328, 119] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193614__803.json | 75.0 | missing | missing | missing | |
| 4100 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231215_192438__478 | 3 | 0.0 | 2.84246 | 2 | [328, 161] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_192438__478.json | 100.0 | 0.9 | missing | 0.1 | |
| 4101 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231213_201408__795 | 3 | 0.000451 | 1.12663 | 2 | [327, 62] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_201408__795.json | 100.0 | missing | missing | missing | |
| 4102 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_190110__300 | 3 | 0.000461 | 1.53971 | 2 | [327, 67] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_190110__300.json | 100.0 | missing | missing | missing | |
| 4103 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_190111__883 | 3 | 0.000465 | 1.06378 | 2 | [327, 69] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_190111__883.json | 100.0 | missing | missing | missing | |
| 4104 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_193608__202 | 3 | 0.000463 | 1.25294 | 2 | [327, 68] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_193608__202.json | 100.0 | missing | missing | missing | |
| 4105 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_193610__716 | 0 | 0.000457 | 1.78052 | 2 | [327, 65] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_193610__716.json | 75.0 | missing | missing | missing | |
| 4106 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 3 | 20231215_192435__988 | 3 | 0.0 | 2.49061 | 2 | [327, 63] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_192435__988.json | 100.0 | 0.9 | missing | 0.1 | |
| 4107 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | InJulia | 1SHOT | false | false | 3 | 20240201_082319__600 | 0 | 0.01087 | 16.0318 | 0 | [70, 339] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_082319__600.json | 0.0 | missing | missing | missing | |
| 4108 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 3 | 20240201_082354__226 | 3 | 0.01363 | 35.5825 | 2 | [70, 431] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_082354__226.json | 100.0 | missing | missing | missing | |
| 4109 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | InJulia | 1SHOT | false | false | 3 | 20240201_082416__218 | 0 | 0.00883 | 21.312 | 0 | [70, 271] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_082416__218.json | 0.0 | missing | missing | missing | |
| 4110 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | InJulia | 1SHOT | false | false | 3 | 20240201_082448__540 | 0 | 0.0148 | 32.0817 | 0 | [70, 470] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_082448__540.json | 0.0 | missing | missing | missing | |
| 4111 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 3 | 20240201_082526__150 | 3 | 0.01546 | 38.3007 | 2 | [70, 492] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_082526__150.json | 100.0 | missing | missing | missing | |
| 4112 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240201_082028__432 | 3 | 0.00255 | 4.08043 | 2 | [105, 50] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_082028__432.json | 100.0 | missing | missing | missing | |
| 4113 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240201_082032__994 | 3 | 0.00267 | 3.87664 | 2 | [105, 54] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_082032__994.json | 100.0 | missing | missing | missing | |
| 4114 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | false | false | 3 | 20240201_082036__630 | 0 | 0.00255 | 3.84267 | 0 | [105, 50] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_082036__630.json | 0.0 | missing | missing | missing | |
| 4115 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240201_082041__248 | 3 | 0.00255 | 5.22462 | 2 | [105, 50] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_082041__248.json | 100.0 | missing | missing | missing | |
| 4116 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 3 | 20240201_082046__632 | 3 | 0.00255 | 4.50915 | 2 | [105, 50] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_082046__632.json | 100.0 | missing | missing | missing | |
| 4117 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240201_081820__716 | 0 | 0.0132 | 28.0614 | 0 | [174, 382] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_081820__716.json | 0.0 | missing | missing | missing | |
| 4118 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240201_081841__175 | 3 | 0.01041 | 20.9157 | 2 | [174, 289] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_081841__175.json | 100.0 | missing | missing | missing | |
| 4119 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240201_081913__742 | 0 | 0.0108 | 32.3746 | 0 | [174, 302] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_081913__742.json | 0.0 | missing | missing | missing | |
| 4120 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20240201_081950__522 | 0 | 0.01227 | 36.4833 | 0 | [174, 351] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_081950__522.json | 0.0 | missing | missing | missing | |
| 4121 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20240201_082005__663 | 3 | 0.00615 | 15.2144 | 2 | [174, 147] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_082005__663.json | 100.0 | missing | missing | missing | |
| 4122 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20240201_083445__832 | 0 | 0.01552 | 31.6696 | 0 | [328, 408] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_083445__832.json | 0.0 | missing | missing | missing | |
| 4123 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240201_083511__110 | 3 | 0.01093 | 26.2833 | 2 | [328, 255] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_083511__110.json | 100.0 | missing | missing | missing | |
| 4124 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240201_083546__880 | 3 | 0.01627 | 34.3835 | 2 | [328, 433] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_083546__880.json | 100.0 | missing | missing | missing | |
| 4125 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240201_083554__317 | 3 | 0.00652 | 8.00735 | 2 | [328, 108] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_083554__317.json | 100.0 | missing | missing | missing | |
| 4126 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20240201_083635__577 | 3 | 0.01723 | 40.6989 | 2 | [328, 465] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_083635__577.json | 100.0 | missing | missing | missing | |
| 4127 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 3 | 20240201_082917__616 | 3 | 0.01863 | 61.3972 | 2 | [327, 512] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_082917__616.json | 100.0 | missing | missing | missing | |
| 4128 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | false | false | 3 | 20240201_082942__238 | 0 | 0.01851 | 24.7249 | 0 | [327, 508] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_082942__238.json | 0.0 | missing | missing | missing | |
| 4129 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | false | false | 3 | 20240201_083008__835 | 0 | 0.01401 | 26.3924 | 0 | [327, 358] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_083008__835.json | 0.0 | missing | missing | missing | |
| 4130 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | false | false | 3 | 20240201_083052__971 | 0 | 0.01494 | 44.1101 | 0 | [327, 389] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_083052__971.json | 0.0 | missing | missing | missing | |
| 4131 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 3 | 20240201_083128__230 | 3 | 0.01719 | 35.0746 | 2 | [327, 464] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_083128__230.json | 100.0 | missing | missing | missing | |
| 4132 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 3 | 20231213_201530__515 | 0 | 0.01066 | 22.0485 | 0 | [67, 333] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_201530__515.json | 0.0 | missing | missing | missing | |
| 4133 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 3 | 20231225_190219__685 | 3 | 0.00829 | 14.2929 | 2 | [67, 254] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_190219__685.json | 100.0 | missing | missing | missing | |
| 4134 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 3 | 20231225_190239__777 | 0 | 0.01303 | 20.6811 | 0 | [67, 412] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_190239__777.json | 0.0 | missing | missing | missing | |
| 4135 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview--optim | AsIs | 1SHOT | true | true | 3 | 20231215_192552__807 | 3 | 0.0 | 31.009 | 2 | [67, 260] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_192552__807.json | 100.0 | 0.1 | missing | 0.9 | |
| 4136 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 3 | 20231213_201508__452 | 3 | 0.0109 | 28.3259 | 2 | [70, 340] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_201508__452.json | 100.0 | missing | missing | missing | |
| 4137 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 3 | 20231225_190147__516 | 3 | 0.00871 | 12.1339 | 2 | [70, 267] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_190147__516.json | 100.0 | missing | missing | missing | |
| 4138 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 3 | 20231225_190204__186 | 3 | 0.00937 | 16.9192 | 2 | [70, 289] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_190204__186.json | 100.0 | missing | missing | missing | |
| 4139 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 3 | 20231227_193802__316 | 3 | 0.00805 | 14.339 | 2 | [70, 245] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_193802__316.json | 100.0 | missing | missing | missing | |
| 4140 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 3 | 20231227_193830__633 | 3 | 0.0079 | 27.5334 | 2 | [70, 240] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_193830__633.json | 100.0 | missing | missing | missing | |
| 4141 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 3 | 20231215_192521__619 | 3 | 0.0 | 18.3145 | 2 | [70, 228] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_192521__619.json | 100.0 | 0.1 | missing | 0.9 | |
| 4142 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231213_201439__673 | 3 | 0.00531 | 10.0011 | 2 | [105, 142] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_201439__673.json | 100.0 | missing | missing | missing | |
| 4143 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_190132__596 | 0 | 0.00372 | 3.97022 | 0 | [105, 89] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_190132__596.json | 0.0 | missing | missing | missing | |
| 4144 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_190135__379 | 3 | 0.00264 | 3.32205 | 2 | [105, 53] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_190135__379.json | 100.0 | missing | missing | missing | |
| 4145 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_193744__227 | 3 | 0.00264 | 5.44411 | 2 | [105, 53] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_193744__227.json | 100.0 | missing | missing | missing | |
| 4146 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_193748__780 | 3 | 0.00264 | 3.44407 | 2 | [105, 53] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_193748__780.json | 100.0 | missing | missing | missing | |
| 4147 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231215_192502__682 | 3 | 0.0 | 6.94833 | 2 | [105, 53] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_192502__682.json | 100.0 | 0.1 | missing | 0.9 | |
| 4148 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231213_201429__676 | 0 | 0.00855 | 18.5154 | 0 | [174, 227] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_201429__676.json | 0.0 | missing | missing | missing | |
| 4149 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_190119__452 | 0 | 0.0057 | 5.10033 | 0 | [174, 132] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190119__452.json | 0.0 | missing | missing | missing | |
| 4150 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_190128__897 | 0 | 0.00822 | 8.55882 | 0 | [174, 216] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190128__897.json | 0.0 | missing | missing | missing | |
| 4151 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_193650__778 | 3 | 0.0165 | 35.4361 | 2 | [174, 492] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193650__778.json | 100.0 | missing | missing | missing | |
| 4152 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_193739__109 | 3 | 0.00978 | 48.749 | 2 | [174, 268] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193739__109.json | 100.0 | missing | missing | missing | |
| 4153 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231215_192455__903 | 3 | 0.0 | 17.1108 | 2 | [174, 158] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_192455__903.json | 100.0 | 0.1 | missing | 0.9 | |
| 4154 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231213_201557__519 | 3 | 0.00835 | 10.5255 | 2 | [328, 169] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_201557__519.json | 100.0 | missing | missing | missing | |
| 4155 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_190312__930 | 0 | 0.00823 | 11.7934 | 0 | [328, 165] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190312__930.json | 0.0 | missing | missing | missing | |
| 4156 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_190323__790 | 0 | 0.01144 | 10.9281 | 0 | [328, 272] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190323__790.json | 0.0 | missing | missing | missing | |
| 4157 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231227_193910__406 | 0 | 0.01069 | 12.364 | 0 | [328, 247] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193910__406.json | 0.0 | missing | missing | missing | |
| 4158 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_193935__612 | 3 | 0.01129 | 24.7466 | 2 | [328, 267] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193935__612.json | 100.0 | missing | missing | missing | |
| 4159 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231215_192636__175 | 3 | 0.0 | 16.6999 | 2 | [328, 209] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_192636__175.json | 100.0 | 0.1 | missing | 0.9 | |
| 4160 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 3 | 20231213_201546__207 | 3 | 0.01077 | 16.0927 | 2 | [327, 250] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_201546__207.json | 100.0 | missing | missing | missing | |
| 4161 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_190253__920 | 3 | 0.01104 | 13.5199 | 2 | [327, 259] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_190253__920.json | 100.0 | missing | missing | missing | |
| 4162 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_190300__703 | 3 | 0.00837 | 6.53129 | 2 | [327, 170] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_190300__703.json | 100.0 | missing | missing | missing | |
| 4163 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_193844__348 | 3 | 0.0102 | 13.8756 | 2 | [327, 231] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_193844__348.json | 100.0 | missing | missing | missing | |
| 4164 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_193858__823 | 3 | 0.01233 | 14.1149 | 2 | [327, 302] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_193858__823.json | 100.0 | missing | missing | missing | |
| 4165 | Apple-MacBook-Pro-M1 | FloatWithUnits | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 3 | 20231215_192619__776 | 3 | 0.0 | 27.4166 | 2 | [327, 308] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_192619__776.json | 100.0 | 0.1 | missing | 0.9 | |
| 4166 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | AsIs | 1SHOT | false | false | 3 | 20231214_001428__451 | 0 | 0.0 | 8.66514 | 0 | [59, 263] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__AsIs__1SHOT__20231214_001428__451.json | 0.0 | missing | missing | missing | |
| 4167 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | AsIs | 1SHOT | false | false | 3 | 20231225_015950__630 | 0 | 0.0 | 8.26282 | 0 | [59, 250] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__AsIs__1SHOT__20231225_015950__630.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4168 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | AsIs | 1SHOT | false | false | 3 | 20231225_015959__767 | 0 | 0.0 | 9.25843 | 0 | [1, 292] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__AsIs__1SHOT__20231225_015959__767.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4169 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | AsIs | 1SHOT | false | false | 3 | 20231225_160522__798 | 0 | 0.0 | 11.6075 | 0 | [59, 352] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__AsIs__1SHOT__20231225_160522__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4170 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | AsIs | 1SHOT | false | false | 3 | 20231225_160527__931 | 0 | 0.0 | 5.22115 | 0 | [1, 168] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__AsIs__1SHOT__20231225_160527__931.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4171 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | true | true | 3 | 20231225_015934__630 | 0 | 0.0 | 6.59675 | 2 | [75, 194] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231225_015934__630.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4172 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | false | false | 3 | 20231225_015941__417 | 0 | 0.0 | 7.12235 | 0 | [1, 226] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231225_015941__417.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4173 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | false | false | 3 | 20231225_160503__191 | 0 | 0.0 | 8.52543 | 0 | [75, 255] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231225_160503__191.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4174 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | false | false | 3 | 20231225_160510__344 | 0 | 0.0 | 6.88798 | 0 | [1, 221] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231225_160510__344.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4175 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | false | false | 3 | 20231226_225753__710 | 0 | 0.0 | 8.33919 | 0 | [75, 252] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231226_225753__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4176 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_015923__867 | 0 | 0.0 | 9.05526 | 0 | [105, 259] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_015923__867.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4177 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_015927__913 | 0 | 0.0 | 4.80797 | 0 | [1, 153] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_015927__913.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4178 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_160449__668 | 0 | 0.0 | 6.32191 | 0 | [105, 177] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_160449__668.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4179 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_160455__449 | 0 | 0.0 | 5.41566 | 0 | [1, 173] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_160455__449.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4180 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_225744__743 | 0 | 0.0 | 6.66344 | 0 | [105, 190] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_225744__743.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4181 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_015905__286 | 0 | 0.0 | 13.0168 | 0 | [205, 196] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015905__286.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4182 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_015914__427 | 0 | 0.0 | 8.77096 | 0 | [1, 266] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015914__427.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4183 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_160437__489 | 0 | 0.0 | 13.1635 | 0 | [205, 201] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160437__489.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4184 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_160443__949 | 0 | 0.0 | 6.14349 | 0 | [1, 190] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160443__949.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4185 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231226_225738__953 | 0 | 0.0 | 14.9232 | 0 | [205, 270] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225738__953.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4186 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_020048__516 | 0 | 0.0 | 13.9086 | 0 | [11, 384] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020048__516.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4187 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_020101__172 | 0 | 0.0 | 12.6602 | 0 | [1, 356] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020101__172.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4188 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_160628__754 | 0 | 0.0 | 15.4415 | 0 | [11, 426] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160628__754.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4189 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_160644__465 | 0 | 0.0 | 16.1672 | 0 | [1, 451] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160644__465.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4190 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_225837__899 | 0 | 0.0 | 15.8751 | 0 | [11, 442] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225837__899.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4191 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_020021__946 | 0 | 0.0 | 22.2384 | 0 | [376, 518] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_020021__946.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4192 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_020034__750 | 0 | 0.0 | 13.1968 | 0 | [1, 371] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_020034__750.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4193 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_160550__825 | 0 | 0.0 | 23.0734 | 0 | [376, 542] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_160550__825.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4194 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_160612__314 | 0 | 0.0 | 22.125 | 0 | [1, 603] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_160612__314.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4195 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | false | false | 3 | 20231226_225821__863 | 0 | 0.0 | 28.8316 | 0 | [376, 691] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_225821__863.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4196 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | AsIs | 1SHOT | false | false | 3 | 20231214_002147__797 | 0 | 0.0 | 8.08662 | 0 | [59, 244] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__AsIs__1SHOT__20231214_002147__797.json | 0.0 | missing | missing | missing | |
| 4197 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | AsIs | 1SHOT | false | false | 3 | 20231225_021903__154 | 0 | 0.0 | 5.5032 | 0 | [73, 175] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__AsIs__1SHOT__20231225_021903__154.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4198 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | AsIs | 1SHOT | false | false | 3 | 20231225_021908__569 | 0 | 0.0 | 5.90102 | 0 | [73, 189] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__AsIs__1SHOT__20231225_021908__569.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4199 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | AsIs | 1SHOT | false | false | 3 | 20231225_162306__989 | 0 | 0.0 | 5.01137 | 0 | [73, 159] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__AsIs__1SHOT__20231225_162306__989.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4200 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | AsIs | 1SHOT | false | false | 3 | 20231225_162314__418 | 0 | 0.0 | 7.98749 | 0 | [73, 261] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__AsIs__1SHOT__20231225_162314__418.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4201 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | true | true | 3 | 20231225_021852__907 | 3 | 0.0 | 4.72671 | 2 | [75, 148] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231225_021852__907.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4202 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | true | true | 3 | 20231225_021857__833 | 3 | 0.0 | 5.11609 | 2 | [75, 161] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231225_021857__833.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4203 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | true | true | 3 | 20231225_162250__528 | 3 | 0.0 | 9.05802 | 2 | [75, 297] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231225_162250__528.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4204 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | false | false | 3 | 20231225_162301__220 | 0 | 0.0 | 10.4038 | 0 | [75, 342] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231225_162301__220.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4205 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | true | true | 3 | 20231226_230546__404 | 3 | 0.0 | 7.47749 | 2 | [75, 242] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231226_230546__404.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4206 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_021840__352 | 3 | 0.0 | 5.10172 | 2 | [115, 155] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_021840__352.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4207 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_021847__828 | 0 | 0.0 | 7.18337 | 2 | [115, 225] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_021847__828.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4208 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231225_162235__822 | 0 | 0.0 | 7.1722 | 0 | [115, 228] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_162235__822.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4209 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_162241__346 | 3 | 0.0 | 6.23869 | 2 | [115, 195] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_162241__346.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4210 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_230539__758 | 3 | 0.0 | 7.01479 | 2 | [115, 221] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_230539__758.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4211 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_021823__846 | 3 | 0.0 | 10.9316 | 2 | [197, 135] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021823__846.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4212 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_021835__637 | 3 | 0.0 | 11.1291 | 2 | [197, 338] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021835__637.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4213 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_162219__132 | 3 | 0.0 | 12.2275 | 2 | [197, 180] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162219__132.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4214 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_162227__601 | 3 | 0.0 | 8.26006 | 2 | [197, 247] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162227__601.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4215 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_230531__307 | 3 | 0.0 | 11.2726 | 2 | [197, 155] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230531__307.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4216 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_021939__615 | 3 | 0.0 | 13.0368 | 2 | [379, 368] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021939__615.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4217 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_021943__484 | 3 | 0.0 | 3.58493 | 2 | [379, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021943__484.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4218 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_162337__152 | 3 | 0.0 | 6.62854 | 2 | [379, 166] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162337__152.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4219 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_162344__973 | 3 | 0.0 | 7.31549 | 2 | [379, 188] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162344__973.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4220 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_230604__310 | 0 | 0.0 | 10.5581 | 0 | [379, 291] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_230604__310.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4221 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_021918__920 | 3 | 0.0 | 9.91751 | 2 | [376, 270] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_021918__920.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4222 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_021926__470 | 3 | 0.0 | 7.6606 | 2 | [376, 198] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_021926__470.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4223 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_162323__494 | 3 | 0.0 | 9.02864 | 2 | [376, 244] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_162323__494.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4224 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_162330__274 | 3 | 0.0 | 7.22273 | 2 | [376, 185] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_162330__274.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4225 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_230554__717 | 3 | 0.0 | 7.43747 | 2 | [376, 191] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_230554__717.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4226 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 3 | 20231227_175906__967 | 3 | 0.0 | 11.6984 | 2 | [75, 225] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175906__967.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4227 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 3 | 20231227_175916__260 | 3 | 0.0 | 10.0851 | 2 | [75, 193] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175916__260.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4228 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 3 | 20231227_175925__727 | 3 | 0.0 | 8.52954 | 2 | [75, 162] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175925__727.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4229 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231227_175830__752 | 0 | 0.0 | 13.6456 | 0 | [115, 259] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175830__752.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4230 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_175840__623 | 3 | 0.0 | 10.2506 | 2 | [115, 192] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175840__623.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4231 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_175854__369 | 3 | 0.0 | 13.7144 | 2 | [115, 261] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175854__369.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4232 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_175753__834 | 3 | 0.0 | 15.6918 | 2 | [197, 288] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175753__834.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4233 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_175804__672 | 0 | 0.0 | 10.7005 | 0 | [197, 190] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175804__672.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4234 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_175816__131 | 3 | 0.0 | 12.5344 | 2 | [197, 226] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175816__131.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4235 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_180021__405 | 3 | 0.0 | 13.6214 | 2 | [379, 227] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180021__405.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4236 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_180034__898 | 3 | 0.0 | 12.7945 | 2 | [379, 211] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180034__898.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4237 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_180052__117 | 3 | 0.0 | 17.3143 | 2 | [379, 298] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180052__117.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4238 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_175941__489 | 3 | 0.0 | 16.4235 | 2 | [376, 281] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175941__489.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4239 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_175958__551 | 3 | 0.0 | 17.0007 | 2 | [376, 292] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175958__551.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4240 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_180008__311 | 3 | 0.0 | 9.38031 | 2 | [376, 145] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180008__311.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4241 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | AsIs | 1SHOT | false | false | 3 | 20231213_201905__251 | 0 | 0.00279668 | 46.1594 | 0 | [71, 322] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__AsIs__1SHOT__20231213_201905__251.json | 0.0 | missing | missing | missing | |
| 4242 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | AsIs | 1SHOT | false | false | 3 | 20231225_190539__838 | 0 | 0.00223038 | 5.62235 | 0 | [71, 252] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__AsIs__1SHOT__20231225_190539__838.json | 0.0 | missing | missing | missing | |
| 4243 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | AsIs | 1SHOT | false | false | 3 | 20231225_190549__792 | 0 | 0.00373512 | 10.4967 | 0 | [71, 438] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__AsIs__1SHOT__20231225_190549__792.json | 0.0 | missing | missing | missing | |
| 4244 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium--optim | AsIs | 1SHOT | false | false | 3 | 20231215_192941__292 | 0 | 0.0 | 37.4164 | 0 | [71, 243] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__AsIs__1SHOT__20231215_192941__292.json | 0.0 | 0.9 | missing | 0.3 | |
| 4245 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | InJulia | 1SHOT | true | true | 3 | 20231213_201819__277 | 3 | 0.0019769 | 26.6021 | 2 | [73, 220] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__InJulia__1SHOT__20231213_201819__277.json | 100.0 | missing | missing | missing | |
| 4246 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | InJulia | 1SHOT | true | true | 3 | 20231225_190526__730 | 3 | 0.00224387 | 5.55815 | 2 | [73, 253] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__InJulia__1SHOT__20231225_190526__730.json | 100.0 | missing | missing | missing | |
| 4247 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | InJulia | 1SHOT | true | true | 3 | 20231225_190533__782 | 3 | 0.0024623 | 6.25557 | 2 | [73, 280] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__InJulia__1SHOT__20231225_190533__782.json | 100.0 | missing | missing | missing | |
| 4248 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | InJulia | 1SHOT | true | true | 3 | 20231227_194146__609 | 3 | 0.00201735 | 16.8157 | 2 | [73, 225] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__InJulia__1SHOT__20231227_194146__609.json | 100.0 | missing | missing | missing | |
| 4249 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | InJulia | 1SHOT | true | true | 3 | 20231227_194201__460 | 3 | 0.00176656 | 13.9966 | 2 | [73, 194] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__InJulia__1SHOT__20231227_194201__460.json | 100.0 | missing | missing | missing | |
| 4250 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium--optim | InJulia | 1SHOT | true | true | 3 | 20231215_192903__976 | 3 | 0.0 | 33.2248 | 2 | [73, 217] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__InJulia__1SHOT__20231215_192903__976.json | 100.0 | 0.9 | missing | 0.3 | |
| 4251 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231213_201752__441 | 3 | 0.00186647 | 22.9869 | 2 | [113, 193] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_201752__441.json | 100.0 | missing | missing | missing | |
| 4252 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_190517__836 | 3 | 0.00189883 | 4.68202 | 2 | [113, 197] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_190517__836.json | 100.0 | missing | missing | missing | |
| 4253 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_190521__982 | 3 | 0.0015995 | 3.63959 | 2 | [113, 160] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_190521__982.json | 100.0 | missing | missing | missing | |
| 4254 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_194112__890 | 3 | 0.00172894 | 17.8772 | 2 | [113, 176] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_194112__890.json | 100.0 | missing | missing | missing | |
| 4255 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_194130__899 | 3 | 0.00226288 | 17.3405 | 2 | [113, 242] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_194130__899.json | 100.0 | missing | missing | missing | |
| 4256 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231215_192830__473 | 3 | 0.0 | 32.163 | 2 | [113, 184] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_192830__473.json | 100.0 | 0.9 | missing | 0.3 | |
| 4257 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231213_201729__557 | 3 | 0.00288069 | 34.522 | 2 | [195, 291] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_201729__557.json | 100.0 | missing | missing | missing | |
| 4258 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_190503__592 | 3 | 0.0023063 | 7.27973 | 2 | [195, 220] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190503__592.json | 100.0 | missing | missing | missing | |
| 4259 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_190512__957 | 3 | 0.00351171 | 8.83063 | 2 | [195, 369] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190512__957.json | 100.0 | missing | missing | missing | |
| 4260 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_194046__861 | 3 | 0.00300204 | 7.04104 | 2 | [195, 306] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194046__861.json | 100.0 | missing | missing | missing | |
| 4261 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_194054__467 | 3 | 0.00334182 | 7.9311 | 2 | [195, 348] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194054__467.json | 100.0 | missing | missing | missing | |
| 4262 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231215_192758__182 | 3 | 0.0 | 40.4722 | 2 | [195, 274] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_192758__182.json | 100.0 | 0.9 | missing | 0.3 | |
| 4263 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231213_202030__724 | 3 | 0.00333703 | 36.8888 | 2 | [376, 287] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202030__724.json | 100.0 | missing | missing | missing | |
| 4264 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_190632__826 | 0 | 0.00426738 | 18.2317 | 2 | [376, 402] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190632__826.json | 75.0 | missing | missing | missing | |
| 4265 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_190731__271 | 3 | 0.00525436 | 58.5668 | 2 | [376, 524] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190731__271.json | 100.0 | missing | missing | missing | |
| 4266 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_194248__384 | 3 | 0.00266556 | 4.726 | 2 | [376, 204] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194248__384.json | 100.0 | missing | missing | missing | |
| 4267 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_194253__969 | 3 | 0.00269792 | 4.76377 | 2 | [376, 208] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194253__969.json | 100.0 | missing | missing | missing | |
| 4268 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231215_193133__999 | 3 | 0.0 | 47.0876 | 2 | [376, 295] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193133__999.json | 100.0 | 0.9 | missing | 0.3 | |
| 4269 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 3 | 20231213_201953__780 | 3 | 0.00377388 | 47.9597 | 2 | [373, 342] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_201953__780.json | 100.0 | missing | missing | missing | |
| 4270 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_190558__308 | 3 | 0.00418647 | 9.01318 | 2 | [373, 393] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_190558__308.json | 100.0 | missing | missing | missing | |
| 4271 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_190613__635 | 3 | 0.00449389 | 14.9039 | 2 | [373, 431] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_190613__635.json | 100.0 | missing | missing | missing | |
| 4272 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_194220__298 | 3 | 0.00369298 | 19.7738 | 2 | [373, 332] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_194220__298.json | 100.0 | missing | missing | missing | |
| 4273 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_194244__754 | 0 | 0.00596627 | 23.0467 | 2 | [373, 613] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_194244__754.json | 75.0 | missing | missing | missing | |
| 4274 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 3 | 20231215_193045__938 | 3 | 0.0 | 64.3261 | 2 | [373, 416] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_193045__938.json | 100.0 | 0.9 | missing | 0.3 | |
| 4275 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | AsIs | 1SHOT | false | false | 3 | 20231213_201642__964 | 0 | 0.00047791 | 5.42247 | 0 | [70, 223] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__AsIs__1SHOT__20231213_201642__964.json | 0.0 | missing | missing | missing | |
| 4276 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | AsIs | 1SHOT | false | false | 3 | 20231225_190430__198 | 0 | 0.00028197 | 1.86471 | 0 | [70, 122] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__AsIs__1SHOT__20231225_190430__198.json | 0.0 | missing | missing | missing | |
| 4277 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | AsIs | 1SHOT | false | false | 3 | 20231225_190433__586 | 0 | 0.00038479 | 2.53595 | 0 | [70, 175] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__AsIs__1SHOT__20231225_190433__586.json | 0.0 | missing | missing | missing | |
| 4278 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small--optim | AsIs | 1SHOT | false | false | 3 | 20231215_192708__146 | 0 | 0.0 | 2.77919 | 0 | [70, 164] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__AsIs__1SHOT__20231215_192708__146.json | 0.0 | 0.9 | missing | 0.3 | |
| 4279 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | InJulia | 1SHOT | true | true | 3 | 20231213_201637__163 | 3 | 0.000653804 | 7.36528 | 2 | [72, 313] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__InJulia__1SHOT__20231213_201637__163.json | 100.0 | missing | missing | missing | |
| 4280 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | InJulia | 1SHOT | true | true | 3 | 20231225_190423__280 | 3 | 0.000675144 | 13.7425 | 2 | [72, 324] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__InJulia__1SHOT__20231225_190423__280.json | 100.0 | missing | missing | missing | |
| 4281 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | InJulia | 1SHOT | true | true | 3 | 20231225_190428__337 | 3 | 0.000743044 | 4.92966 | 2 | [72, 359] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__InJulia__1SHOT__20231225_190428__337.json | 100.0 | missing | missing | missing | |
| 4282 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | InJulia | 1SHOT | true | true | 3 | 20231227_194016__132 | 3 | 0.000498604 | 3.2152 | 2 | [72, 233] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__InJulia__1SHOT__20231227_194016__132.json | 100.0 | missing | missing | missing | |
| 4283 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | InJulia | 1SHOT | true | true | 3 | 20231227_194020__431 | 3 | 0.000671264 | 4.36134 | 2 | [72, 322] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__InJulia__1SHOT__20231227_194020__431.json | 100.0 | missing | missing | missing | |
| 4284 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small--optim | InJulia | 1SHOT | true | true | 3 | 20231215_192705__683 | 3 | 0.0 | 5.67072 | 2 | [72, 363] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__InJulia__1SHOT__20231215_192705__683.json | 100.0 | 0.9 | missing | 0.3 | |
| 4285 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231213_201629__931 | 0 | 0.000424898 | 4.29308 | 0 | [114, 181] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_201629__931.json | 25.0 | missing | missing | missing | |
| 4286 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231225_190406__764 | 0 | 0.000386098 | 2.26549 | 0 | [114, 161] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_190406__764.json | 25.0 | missing | missing | missing | |
| 4287 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_190409__231 | 3 | 0.000389978 | 3.12984 | 2 | [114, 163] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_190409__231.json | 100.0 | missing | missing | missing | |
| 4288 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_194011__990 | 3 | 0.000370578 | 2.14168 | 2 | [114, 153] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_194011__990.json | 100.0 | missing | missing | missing | |
| 4289 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_194013__463 | 3 | 0.000335658 | 1.93681 | 2 | [114, 135] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_194013__463.json | 100.0 | missing | missing | missing | |
| 4290 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231215_192700__765 | 3 | 0.0 | 2.67659 | 2 | [114, 194] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_192700__765.json | 100.0 | 0.9 | missing | 0.3 | |
| 4291 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231213_201624__234 | 3 | 0.000649965 | 6.33972 | 2 | [195, 270] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_201624__234.json | 100.0 | missing | missing | missing | |
| 4292 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_190400__711 | 3 | 0.000673245 | 3.95981 | 2 | [195, 282] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190400__711.json | 100.0 | missing | missing | missing | |
| 4293 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_190403__172 | 3 | 0.000611165 | 3.42681 | 2 | [195, 250] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190403__172.json | 100.0 | missing | missing | missing | |
| 4294 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_194004__268 | 0 | 0.000696525 | 4.17072 | 0 | [195, 294] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194004__268.json | 25.0 | missing | missing | missing | |
| 4295 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_194008__947 | 0 | 0.000758605 | 4.38071 | 0 | [195, 326] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194008__947.json | 25.0 | missing | missing | missing | |
| 4296 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231215_192657__207 | 3 | 0.0 | 3.5916 | 2 | [195, 271] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_192657__207.json | 100.0 | 0.9 | missing | 0.3 | |
| 4297 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231213_201654__629 | 3 | 0.00072504 | 3.49275 | 2 | [380, 247] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_201654__629.json | 100.0 | missing | missing | missing | |
| 4298 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_190451__766 | 0 | 0.00103738 | 5.6729 | 0 | [380, 408] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190451__766.json | 0.0 | missing | missing | missing | |
| 4299 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_190456__160 | 0 | 0.0008686 | 4.61515 | 0 | [380, 321] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190456__160.json | 25.0 | missing | missing | missing | |
| 4300 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_194031__291 | 3 | 0.00061058 | 2.76354 | 2 | [380, 188] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194031__291.json | 100.0 | missing | missing | missing | |
| 4301 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231227_194039__405 | 0 | 0.00132256 | 7.46697 | 0 | [380, 555] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194039__405.json | 0.0 | missing | missing | missing | |
| 4302 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231215_192717__872 | 0 | 0.0 | 3.87873 | 0 | [380, 287] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_192717__872.json | 0.0 | 0.9 | missing | 0.3 | |
| 4303 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapTask | 1SHOT | true | true | 3 | 20231213_201651__524 | 0 | 0.00121069 | 8.59987 | 2 | [378, 498] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_201651__524.json | 75.0 | missing | missing | missing | |
| 4304 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_190438__436 | 0 | 0.000942966 | 5.64682 | 0 | [378, 360] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_190438__436.json | 25.0 | missing | missing | missing | |
| 4305 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_190445__497 | 0 | 0.00109041 | 7.12184 | 0 | [378, 436] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_190445__497.json | 25.0 | missing | missing | missing | |
| 4306 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_194024__330 | 3 | 0.000795526 | 4.0353 | 2 | [378, 284] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_194024__330.json | 100.0 | missing | missing | missing | |
| 4307 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_194028__566 | 3 | 0.000812986 | 4.04116 | 2 | [378, 293] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_194028__566.json | 100.0 | missing | missing | missing | |
| 4308 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-small--optim | JuliaRecapTask | 1SHOT | true | false | 3 | 20231215_192713__875 | 0 | 0.0 | 5.11686 | 0 | [378, 384] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_192713__875.json | 25.0 | 0.9 | missing | 0.3 | |
| 4309 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | AsIs | 1SHOT | false | false | 3 | 20231213_201610__323 | 0 | 0.000135281 | 4.56583 | 0 | [70, 277] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__AsIs__1SHOT__20231213_201610__323.json | 0.0 | missing | missing | missing | |
| 4310 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | AsIs | 1SHOT | false | false | 3 | 20231225_190342__914 | 0 | 0.000117614 | 2.20826 | 0 | [70, 238] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__AsIs__1SHOT__20231225_190342__914.json | 0.0 | missing | missing | missing | |
| 4311 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | AsIs | 1SHOT | false | false | 3 | 20231225_190344__254 | 0 | 0.000108101 | 2.04038 | 0 | [70, 217] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__AsIs__1SHOT__20231225_190344__254.json | 0.0 | missing | missing | missing | |
| 4312 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny--optim | AsIs | 1SHOT | false | false | 3 | 20231215_192646__111 | 0 | 0.0 | 2.90839 | 0 | [70, 342] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__AsIs__1SHOT__20231215_192646__111.json | 0.0 | 0.9 | missing | 0.3 | |
| 4313 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | InJulia | 1SHOT | true | true | 3 | 20231213_201605__483 | 3 | 0.000104304 | 2.96003 | 2 | [72, 208] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__InJulia__1SHOT__20231213_201605__483.json | 100.0 | missing | missing | missing | |
| 4314 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | InJulia | 1SHOT | true | true | 3 | 20231225_190337__338 | 3 | 0.000126501 | 2.35889 | 2 | [72, 257] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__InJulia__1SHOT__20231225_190337__338.json | 100.0 | missing | missing | missing | |
| 4315 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | InJulia | 1SHOT | true | true | 3 | 20231225_190339__585 | 3 | 0.000146886 | 2.67832 | 2 | [72, 302] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__InJulia__1SHOT__20231225_190339__585.json | 100.0 | missing | missing | missing | |
| 4316 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | InJulia | 1SHOT | true | true | 3 | 20231227_193947__718 | 3 | 0.00010068 | 2.12927 | 2 | [72, 200] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__InJulia__1SHOT__20231227_193947__718.json | 100.0 | missing | missing | missing | |
| 4317 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny--optim | InJulia | 1SHOT | true | true | 3 | 20231215_192643__680 | 3 | 0.0 | 3.36261 | 2 | [72, 394] | 0.10.0-DEV | 2 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__InJulia__1SHOT__20231215_192643__680.json | 100.0 | 0.9 | missing | 0.3 | |
| 4318 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231213_201602__481 | 3 | 5.673e-5 | 0.962979 | 2 | [114, 90] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_201602__481.json | 100.0 | missing | missing | missing | |
| 4319 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_190333__328 | 3 | 6.4884e-5 | 1.12689 | 2 | [114, 108] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_190333__328.json | 100.0 | missing | missing | missing | |
| 4320 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_190334__862 | 3 | 4.5405e-5 | 0.827738 | 2 | [114, 65] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_190334__862.json | 100.0 | missing | missing | missing | |
| 4321 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_193944__396 | 3 | 4.3593e-5 | 0.667217 | 2 | [114, 61] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_193944__396.json | 100.0 | missing | missing | missing | |
| 4322 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_193945__279 | 3 | 4.5858e-5 | 0.778097 | 2 | [114, 66] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_193945__279.json | 100.0 | missing | missing | missing | |
| 4323 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231215_192640__687 | 3 | 0.0 | 0.652998 | 2 | [114, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_192640__687.json | 100.0 | 0.9 | missing | 0.3 | |
| 4324 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231213_201601__203 | 0 | 0.000127866 | 4.49547 | 0 | [195, 222] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_201601__203.json | 25.0 | missing | missing | missing | |
| 4325 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_190330__689 | 3 | 0.000156405 | 7.66506 | 2 | [195, 285] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190330__689.json | 100.0 | missing | missing | missing | |
| 4326 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_190332__842 | 0 | 0.000110652 | 1.81376 | 0 | [195, 184] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190332__842.json | 0.0 | missing | missing | missing | |
| 4327 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_193941__214 | 3 | 9.2532e-5 | 6.2136 | 2 | [195, 144] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193941__214.json | 100.0 | missing | missing | missing | |
| 4328 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_193943__144 | 3 | 0.000124242 | 1.95953 | 2 | [195, 214] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_193943__144.json | 100.0 | missing | missing | missing | |
| 4329 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231215_192639__150 | 3 | 0.0 | 3.02828 | 2 | [195, 122] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_192639__150.json | 100.0 | 0.9 | missing | 0.3 | |
| 4330 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231213_201618__989 | 0 | 0.000157843 | 3.87493 | 0 | [380, 231] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_201618__989.json | 25.0 | missing | missing | missing | |
| 4331 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_190353__840 | 0 | 0.000171886 | 2.39297 | 0 | [380, 262] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190353__840.json | 25.0 | missing | missing | missing | |
| 4332 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_190356__738 | 3 | 0.00019363 | 2.82533 | 2 | [380, 310] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190356__738.json | 100.0 | missing | missing | missing | |
| 4333 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_193957__389 | 3 | 0.000237571 | 3.74045 | 2 | [380, 407] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_193957__389.json | 100.0 | missing | missing | missing | |
| 4334 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_194000__748 | 3 | 0.000162373 | 2.27212 | 2 | [380, 241] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194000__748.json | 100.0 | missing | missing | missing | |
| 4335 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231215_192653__571 | 3 | 0.0 | 3.66572 | 2 | [380, 404] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_192653__571.json | 100.0 | 0.9 | missing | 0.3 | |
| 4336 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 3 | 20231213_201614__470 | 3 | 0.000167076 | 3.97822 | 2 | [378, 252] | 0.10.0-DEV | 2 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_201614__470.json | 100.0 | missing | missing | missing | |
| 4337 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_190347__328 | 0 | 0.000210111 | 3.05454 | 0 | [378, 347] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_190347__328.json | 25.0 | missing | missing | missing | |
| 4338 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_190351__248 | 3 | 0.000235026 | 3.59609 | 2 | [378, 402] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_190351__248.json | 100.0 | missing | missing | missing | |
| 4339 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_193950__537 | 0 | 0.000203316 | 3.13209 | 0 | [378, 332] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_193950__537.json | 25.0 | missing | missing | missing | |
| 4340 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_193954__908 | 3 | 0.00020694 | 3.26588 | 2 | [378, 340] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_193954__908.json | 100.0 | missing | missing | missing | |
| 4341 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | true | 3 | 20231215_192649__465 | 3 | 0.0 | 2.81124 | 2 | [378, 312] | 0.10.0-DEV | 2 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_192649__465.json | 100.0 | 0.9 | missing | 0.3 | |
| 4342 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231219_225023__390 | 0 | 0.0 | 8.30335 | 0 | [1, 264] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_225023__390.json | 0.0 | missing | missing | missing | |
| 4343 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_024048__341 | 0 | 0.0 | 4.49326 | 0 | [68, 105] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_024048__341.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4344 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_024053__724 | 0 | 0.0 | 4.63649 | 0 | [68, 109] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_024053__724.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4345 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_164523__295 | 0 | 0.0 | 2.74266 | 0 | [68, 59] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_164523__295.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4346 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_164527__446 | 0 | 0.0 | 3.41629 | 0 | [68, 77] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_164527__446.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4347 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 3 | 20231225_024039__568 | 0 | 0.0 | 3.19417 | 0 | [71, 71] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_024039__568.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4348 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_024044__853 | 0 | 0.0 | 4.87036 | 2 | [71, 115] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_024044__853.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4349 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231225_164517__985 | 0 | 0.0 | 2.54055 | 0 | [71, 54] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_164517__985.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4350 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_164521__549 | 0 | 0.0 | 3.75451 | 2 | [71, 86] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_164521__549.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4351 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231226_231607__845 | 0 | 0.0 | 3.88225 | 0 | [71, 89] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_231607__845.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4352 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_024032__198 | 0 | 0.0 | 2.53663 | 2 | [113, 49] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_024032__198.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4353 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231225_024035__903 | 0 | 0.0 | 3.1098 | 0 | [113, 64] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_024035__903.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4354 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164512__437 | 0 | 0.0 | 2.50448 | 2 | [113, 48] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164512__437.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4355 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164514__767 | 0 | 0.0 | 2.53264 | 2 | [113, 49] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164514__767.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4356 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_231603__454 | 0 | 0.0 | 2.95672 | 0 | [113, 60] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_231603__454.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4357 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_024025__458 | 0 | 0.0 | 10.2196 | 0 | [194, 86] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024025__458.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4358 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_024030__224 | 0 | 0.0 | 4.56892 | 0 | [194, 89] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024030__224.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4359 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_164502__759 | 0 | 0.0 | 11.0954 | 0 | [194, 115] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164502__759.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4360 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_164509__416 | 0 | 0.0 | 7.21817 | 2 | [194, 158] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164509__416.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4361 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_231600__850 | 0 | 0.0 | 13.7407 | 2 | [194, 188] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231600__850.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4362 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_024136__402 | 0 | 0.0 | 16.2947 | 0 | [380, 360] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024136__402.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4363 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_024150__155 | 0 | 0.0 | 13.9509 | 0 | [380, 302] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024150__155.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4364 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_164549__898 | 0 | 0.0 | 7.21815 | 2 | [380, 134] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164549__898.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4365 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_164555__764 | 0 | 0.0 | 6.32119 | 0 | [380, 111] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164555__764.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4366 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_231635__428 | 0 | 0.0 | 12.2722 | 0 | [380, 260] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231635__428.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4367 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_024108__778 | 0 | 0.0 | 15.188 | 0 | [378, 332] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_024108__778.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4368 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_024119__686 | 0 | 0.0 | 11.3201 | 2 | [378, 236] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_024119__686.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4369 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164535__658 | 0 | 0.0 | 7.92794 | 2 | [378, 152] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164535__658.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4370 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_164542__104 | 0 | 0.0 | 6.99707 | 0 | [378, 128] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164542__104.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4371 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_231623__130 | 0 | 0.0 | 15.6173 | 0 | [378, 343] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_231623__130.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4372 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 3 | 20231227_231104__982 | 3 | 0.0 | 9.11696 | 2 | [70, 287] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231104__982.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4373 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 3 | 20231227_231112__815 | 3 | 0.0 | 7.56339 | 2 | [70, 236] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231112__815.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4374 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 3 | 20231227_231119__269 | 3 | 0.0 | 6.90375 | 2 | [70, 213] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231119__269.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4375 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 3 | 20231227_231128__692 | 3 | 0.0 | 9.0182 | 2 | [70, 283] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231128__692.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4376 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 3 | 20231227_231133__263 | 3 | 0.0 | 5.38113 | 2 | [70, 165] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231133__263.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4377 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231045__993 | 0 | 0.0 | 3.04765 | 2 | [112, 82] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231045__993.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4378 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231047__417 | 3 | 0.0 | 2.54302 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231047__417.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4379 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231050__927 | 3 | 0.0 | 2.53481 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231050__927.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4380 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231053__145 | 3 | 0.0 | 2.55616 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231053__145.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4381 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231055__777 | 3 | 0.0 | 2.52786 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231055__777.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4382 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_231015__987 | 0 | 0.0 | 5.65702 | 0 | [193, 126] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231015__987.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4383 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231021__282 | 0 | 0.0 | 6.15233 | 2 | [193, 169] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231021__282.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4384 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231029__319 | 0 | 0.0 | 8.55397 | 2 | [193, 247] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231029__319.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4385 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231039__858 | 0 | 0.0 | 9.32394 | 2 | [193, 271] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231039__858.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4386 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231042__715 | 0 | 0.0 | 2.8811 | 2 | [193, 62] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231042__715.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4387 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231248__186 | 0 | 0.0 | 11.0062 | 0 | [379, 294] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231248__186.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4388 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_231258__284 | 0 | 0.0 | 10.0841 | 2 | [379, 265] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231258__284.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4389 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_231311__976 | 0 | 0.0 | 13.0121 | 2 | [379, 356] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231311__976.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4390 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_231320__924 | 0 | 0.0 | 8.14404 | 2 | [379, 205] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231320__924.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4391 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231330__938 | 0 | 0.0 | 10.5517 | 0 | [379, 280] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231330__938.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4392 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_231145__350 | 0 | 0.0 | 11.7938 | 2 | [377, 318] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231145__350.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4393 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_231155__573 | 0 | 0.0 | 10.1525 | 2 | [377, 268] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231155__573.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4394 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_231213__651 | 3 | 0.0 | 17.2689 | 2 | [377, 486] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231213__651.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4395 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231224__867 | 0 | 0.0 | 11.6653 | 0 | [377, 314] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231224__867.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4396 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231237__823 | 0 | 0.0 | 12.7693 | 0 | [377, 349] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231237__823.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4397 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231227_231439__387 | 3 | 0.0 | 12.6618 | 2 | [70, 316] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231439__387.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4398 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_231450__558 | 0 | 0.0 | 10.3246 | 0 | [70, 256] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231450__558.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4399 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231227_231458__591 | 3 | 0.0 | 8.38527 | 2 | [70, 206] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231458__591.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4400 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231227_231505__166 | 3 | 0.0 | 7.34028 | 2 | [70, 179] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231505__166.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4401 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_231518__395 | 0 | 0.0 | 12.0791 | 0 | [70, 301] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231518__395.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4402 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231414__297 | 3 | 0.0 | 3.16388 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231414__297.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4403 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231417__971 | 3 | 0.0 | 3.15585 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231417__971.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4404 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231420__721 | 3 | 0.0 | 3.16775 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231420__721.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4405 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231423__307 | 3 | 0.0 | 3.15599 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231423__307.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4406 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231427__150 | 3 | 0.0 | 3.16834 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231427__150.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4407 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_231336__261 | 0 | 0.0 | 5.86824 | 0 | [193, 101] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231336__261.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4408 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231341__356 | 3 | 0.0 | 5.04488 | 2 | [193, 101] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231341__356.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4409 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231352__849 | 3 | 0.0 | 10.9519 | 2 | [193, 252] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231352__849.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4410 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231407__234 | 3 | 0.0 | 15.1453 | 2 | [193, 358] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231407__234.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4411 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231411__513 | 0 | 0.0 | 3.38626 | 2 | [193, 58] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231411__513.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4412 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_231648__298 | 0 | 0.0 | 9.38932 | 2 | [379, 187] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231648__298.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4413 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231659__565 | 0 | 0.0 | 10.7949 | 0 | [379, 222] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231659__565.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4414 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_231712__988 | 0 | 0.0 | 13.1782 | 2 | [379, 281] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231712__988.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4415 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231725__873 | 0 | 0.0 | 12.6866 | 0 | [379, 269] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231725__873.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4416 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231737__626 | 0 | 0.0 | 12.8111 | 0 | [379, 272] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231737__626.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4417 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231532__264 | 0 | 0.0 | 14.465 | 0 | [377, 313] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231532__264.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4418 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_231545__431 | 0 | 0.0 | 13.415 | 2 | [377, 287] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231545__431.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4419 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231605__819 | 0 | 0.0 | 19.5552 | 0 | [377, 437] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231605__819.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4420 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_231624__517 | 3 | 0.0 | 18.7995 | 2 | [377, 419] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231624__517.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4421 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231638__212 | 0 | 0.0 | 14.6301 | 0 | [377, 317] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231638__212.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4422 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 3 | 20231226_121028__745 | 0 | 0.0 | 11.0248 | 0 | [67, 197] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_121028__745.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4423 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 3 | 20231226_121045__489 | 0 | 0.0 | 16.2181 | 0 | [67, 290] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_121045__489.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4424 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 3 | 20231226_121005__633 | 0 | 0.0 | 14.2311 | 0 | [70, 256] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121005__633.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4425 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 3 | 20231226_121017__843 | 3 | 0.0 | 11.9423 | 2 | [70, 214] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121017__843.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4426 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 3 | 20231226_231924__951 | 3 | 0.0 | 19.5435 | 2 | [70, 359] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_231924__951.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4427 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_120947__277 | 3 | 0.0 | 4.28954 | 2 | [112, 66] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120947__277.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4428 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_120951__990 | 3 | 0.0 | 4.24373 | 2 | [112, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120951__990.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4429 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_231904__164 | 3 | 0.0 | 4.18562 | 2 | [112, 66] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_231904__164.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4430 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_120936__534 | 3 | 0.0 | 4.33717 | 2 | [193, 57] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120936__534.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4431 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_120942__959 | 3 | 0.0 | 6.51527 | 2 | [193, 97] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120942__959.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4432 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_231900__844 | 3 | 0.0 | 21.7417 | 2 | [193, 218] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231900__844.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4433 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_121137__421 | 0 | 0.0 | 15.8534 | 0 | [379, 248] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121137__421.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4434 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_121156__395 | 0 | 0.0 | 19.5119 | 0 | [379, 312] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121156__395.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4435 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231226_232000__947 | 0 | 0.0 | 15.7192 | 2 | [379, 252] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232000__947.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4436 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_121102__195 | 0 | 0.0 | 17.7423 | 2 | [377, 280] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121102__195.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4437 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_121120__470 | 0 | 0.0 | 17.9931 | 2 | [377, 285] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121120__470.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4438 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_231944__277 | 3 | 0.0 | 19.9442 | 2 | [377, 329] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_231944__277.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4439 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231227_110712__574 | 3 | 0.0 | 44.5699 | 2 | [78, 261] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_110712__574.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4440 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231227_110744__172 | 3 | 0.0 | 31.821 | 2 | [78, 183] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_110744__172.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4441 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231227_110829__966 | 3 | 0.0 | 44.8916 | 2 | [78, 263] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_110829__966.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4442 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_144830__692 | 0 | 0.0 | 53.9822 | 0 | [78, 317] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_144830__692.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4443 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231227_144902__360 | 3 | 0.0 | 31.2832 | 2 | [78, 179] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_144902__360.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4444 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_110601__700 | 3 | 0.0 | 13.857 | 2 | [117, 67] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_110601__700.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4445 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_110615__449 | 3 | 0.0 | 13.7058 | 2 | [117, 66] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_110615__449.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4446 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231227_110627__472 | 0 | 0.0 | 12.4108 | 0 | [117, 58] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_110627__472.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4447 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_144721__984 | 3 | 0.0 | 20.4395 | 2 | [117, 107] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_144721__984.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4448 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_144736__227 | 3 | 0.0 | 14.7458 | 2 | [117, 72] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_144736__227.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4449 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_110434__998 | 3 | 0.0 | 58.605 | 2 | [197, 294] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110434__998.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4450 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_110507__541 | 3 | 0.0 | 33.206 | 2 | [197, 171] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110507__541.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4451 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231227_110547__982 | 0 | 0.0 | 40.3086 | 0 | [197, 214] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110547__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4452 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_144552__781 | 3 | 0.0 | 35.6767 | 2 | [197, 185] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_144552__781.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4453 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_144701__969 | 3 | 0.0 | 69.1044 | 2 | [197, 385] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_144701__969.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4454 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_111137__175 | 0 | 0.0 | 55.0954 | 2 | [391, 268] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111137__175.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4455 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_111214__138 | 3 | 0.0 | 36.7047 | 2 | [391, 159] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111214__138.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4456 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_111302__445 | 3 | 0.0 | 48.6615 | 2 | [391, 230] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111302__445.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4457 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_145152__115 | 3 | 0.0 | 61.1757 | 2 | [391, 302] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_145152__115.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4458 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_145237__811 | 3 | 0.0 | 44.3237 | 2 | [391, 203] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_145237__811.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4459 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_110900__608 | 3 | 0.0 | 31.1723 | 2 | [389, 126] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_110900__608.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4460 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_110943__911 | 3 | 0.0 | 43.4157 | 2 | [389, 199] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_110943__911.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4461 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_111042__420 | 0 | 0.0 | 58.2972 | 0 | [389, 287] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_111042__420.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4462 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20231227_144946__136 | 0 | 0.0 | 44.8223 | 0 | [389, 206] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_144946__136.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4463 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_145051__592 | 3 | 0.0 | 64.3833 | 2 | [389, 321] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_145051__592.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4464 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231219_225350__447 | 0 | 0.0 | 6.79356 | 0 | [1, 218] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_225350__447.json | 0.0 | missing | missing | missing | |
| 4465 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_024250__224 | 0 | 0.0 | 11.0048 | 0 | [76, 274] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_024250__224.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4466 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_024255__381 | 0 | 0.0 | 4.83479 | 0 | [76, 114] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_024255__381.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4467 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_164657__160 | 0 | 0.0 | 11.3921 | 0 | [76, 285] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_164657__160.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4468 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_164703__424 | 0 | 0.0 | 6.31042 | 0 | [76, 153] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_164703__424.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4469 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_024232__888 | 0 | 0.0 | 8.32703 | 2 | [79, 205] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_024232__888.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4470 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_024239__649 | 3 | 0.0 | 6.7152 | 2 | [79, 163] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_024239__649.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4471 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_164638__305 | 0 | 0.0 | 7.07638 | 2 | [79, 173] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_164638__305.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4472 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_164646__250 | 0 | 0.0 | 7.66105 | 2 | [79, 188] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_164646__250.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4473 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231226_231707__192 | 0 | 0.0 | 6.83955 | 0 | [79, 166] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_231707__192.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4474 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_024220__523 | 0 | 0.0 | 3.50493 | 2 | [121, 74] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_024220__523.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4475 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_024224__444 | 0 | 0.0 | 3.50741 | 2 | [121, 74] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_024224__444.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4476 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164628__524 | 3 | 0.0 | 3.15992 | 2 | [121, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164628__524.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4477 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164631__174 | 3 | 0.0 | 3.15353 | 2 | [121, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164631__174.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4478 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_231700__238 | 3 | 0.0 | 6.55295 | 2 | [121, 153] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_231700__238.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4479 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_024208__996 | 0 | 0.0 | 18.4875 | 0 | [202, 283] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024208__996.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4480 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_024217__293 | 0 | 0.0 | 8.54066 | 0 | [202, 191] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024217__293.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4481 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_164613__722 | 3 | 0.0 | 17.8829 | 2 | [202, 264] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164613__722.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4482 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_164624__878 | 0 | 0.0 | 10.77 | 2 | [202, 249] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164624__878.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4483 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_231653__746 | 3 | 0.0 | 17.9349 | 2 | [202, 279] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231653__746.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4484 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_024336__582 | 0 | 0.0 | 12.5092 | 2 | [388, 262] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024336__582.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4485 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_024348__646 | 0 | 0.0 | 12.2442 | 2 | [388, 255] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024348__646.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4486 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_164736__811 | 3 | 0.0 | 10.3924 | 2 | [388, 210] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164736__811.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4487 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_164748__278 | 0 | 0.0 | 11.5615 | 0 | [388, 239] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164748__278.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4488 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231226_231730__434 | 3 | 0.0 | 11.8748 | 2 | [388, 246] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231730__434.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4489 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_024305__258 | 0 | 0.0 | 9.87868 | 2 | [386, 196] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_024305__258.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4490 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_024323__968 | 0 | 0.0 | 18.3992 | 2 | [386, 407] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_024323__968.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4491 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164715__232 | 0 | 0.0 | 11.5693 | 2 | [386, 239] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164715__232.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4492 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164726__886 | 3 | 0.0 | 10.4577 | 2 | [386, 211] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164726__886.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4493 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_231718__282 | 0 | 0.0 | 11.6793 | 2 | [386, 241] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_231718__282.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4494 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 3 | 20231214_001555__178 | 0 | 0.0 | 7.99051 | 0 | [59, 242] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_001555__178.json | 0.0 | missing | missing | missing | |
| 4495 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 3 | 20231225_020213__109 | 0 | 0.0 | 5.84241 | 0 | [75, 182] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_020213__109.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4496 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 3 | 20231225_020225__720 | 0 | 0.0 | 11.2104 | 0 | [75, 360] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_020225__720.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4497 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 3 | 20231225_160753__683 | 0 | 0.0 | 7.34798 | 0 | [75, 235] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_160753__683.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4498 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 3 | 20231225_160800__202 | 0 | 0.0 | 6.33786 | 0 | [75, 200] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_160800__202.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4499 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 3 | 20231225_020201__969 | 0 | 0.0 | 11.8559 | 0 | [77, 379] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_020201__969.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4500 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 3 | 20231225_020208__244 | 0 | 0.0 | 6.09286 | 2 | [77, 191] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_020208__244.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4501 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 3 | 20231225_160738__788 | 0 | 0.0 | 10.4223 | 0 | [77, 337] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_160738__788.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4502 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 3 | 20231225_160746__611 | 0 | 0.0 | 7.58797 | 2 | [77, 242] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_160746__611.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4503 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 3 | 20231226_225903__197 | 0 | 0.0 | 9.45056 | 2 | [77, 303] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_225903__197.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4504 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_020140__962 | 0 | 0.0 | 9.10963 | 0 | [119, 284] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_020140__962.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4505 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_020149__735 | 3 | 0.0 | 9.80019 | 2 | [119, 307] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_020149__735.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4506 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_160721__524 | 3 | 0.0 | 10.2945 | 2 | [119, 326] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_160721__524.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4507 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_160728__810 | 0 | 0.0 | 6.59648 | 0 | [119, 203] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_160728__810.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4508 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_225854__264 | 0 | 0.0 | 3.63372 | 2 | [119, 103] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_225854__264.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4509 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_020115__956 | 3 | 0.0 | 14.2158 | 2 | [200, 259] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_020115__956.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4510 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_020130__662 | 3 | 0.0 | 14.9413 | 2 | [200, 454] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_020130__662.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4511 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_160700__635 | 3 | 0.0 | 15.7963 | 2 | [200, 319] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160700__635.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4512 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_160711__521 | 0 | 0.0 | 10.4282 | 0 | [200, 313] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160711__521.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4513 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_225850__255 | 3 | 0.0 | 12.4874 | 2 | [200, 218] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225850__255.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4514 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_020245__425 | 0 | 0.0 | 2.49209 | 0 | [386, 21] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020245__425.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4515 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_020247__470 | 0 | 0.0 | 2.36847 | 0 | [386, 17] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020247__470.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4516 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_160834__630 | 3 | 0.0 | 11.3783 | 2 | [386, 307] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160834__630.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4517 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_160836__770 | 0 | 0.0 | 2.64802 | 0 | [386, 26] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160836__770.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4518 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_225916__988 | 0 | 0.0 | 4.10495 | 0 | [386, 74] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225916__988.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4519 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_020235__905 | 0 | 0.0 | 10.4227 | 2 | [384, 280] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_020235__905.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4520 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_020242__909 | 0 | 0.0 | 7.02635 | 2 | [384, 173] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_020242__909.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4521 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_160811__852 | 0 | 0.0 | 10.9544 | 2 | [384, 300] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_160811__852.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4522 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_160822__851 | 3 | 0.0 | 11.5297 | 2 | [384, 318] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_160822__851.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4523 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_225912__645 | 0 | 0.0 | 8.71486 | 2 | [384, 227] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_225912__645.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4524 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | AsIs | 1SHOT | false | false | 3 | 20231214_002429__567 | 0 | 0.0 | 8.03434 | 0 | [59, 243] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__AsIs__1SHOT__20231214_002429__567.json | 0.0 | missing | missing | missing | |
| 4525 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | AsIs | 1SHOT | false | false | 3 | 20231225_022147__761 | 0 | 0.0 | 6.59715 | 0 | [76, 111] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__AsIs__1SHOT__20231225_022147__761.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4526 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | AsIs | 1SHOT | false | false | 3 | 20231225_022155__284 | 0 | 0.0 | 8.16782 | 0 | [76, 141] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__AsIs__1SHOT__20231225_022155__284.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4527 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | AsIs | 1SHOT | false | false | 3 | 20231225_162542__870 | 0 | 0.0 | 8.38032 | 0 | [76, 146] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__AsIs__1SHOT__20231225_162542__870.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4528 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | AsIs | 1SHOT | false | false | 3 | 20231225_162552__371 | 0 | 0.0 | 9.77045 | 0 | [76, 173] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__AsIs__1SHOT__20231225_162552__371.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4529 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | true | 3 | 20231225_022130__265 | 3 | 0.0 | 7.39047 | 2 | [78, 126] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231225_022130__265.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4530 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | true | 3 | 20231225_022140__453 | 3 | 0.0 | 10.0571 | 2 | [78, 177] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231225_022140__453.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4531 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | true | 3 | 20231225_162524__948 | 0 | 0.0 | 5.51937 | 2 | [78, 91] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231225_162524__948.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4532 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | true | 3 | 20231225_162534__510 | 3 | 0.0 | 9.38136 | 2 | [78, 165] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231225_162534__510.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4533 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | false | 3 | 20231226_230705__476 | 0 | 0.0 | 12.7967 | 0 | [78, 229] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231226_230705__476.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4534 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_022118__755 | 0 | 0.0 | 4.16654 | 0 | [118, 60] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_022118__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4535 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_022122__839 | 0 | 0.0 | 4.17664 | 0 | [118, 60] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_022122__839.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4536 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162515__378 | 0 | 0.0 | 3.9497 | 0 | [118, 56] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_162515__378.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4537 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162519__626 | 0 | 0.0 | 4.14889 | 0 | [118, 60] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_162519__626.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4538 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_230652__964 | 0 | 0.0 | 4.16354 | 0 | [118, 60] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_230652__964.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4539 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_022109__508 | 3 | 0.0 | 30.2381 | 2 | [200, 349] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_022109__508.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4540 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_022114__114 | 0 | 0.0 | 5.18613 | 0 | [200, 64] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_022114__114.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4541 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_162504__282 | 0 | 0.0 | 20.5389 | 2 | [200, 178] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162504__282.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4542 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_162511__768 | 0 | 0.0 | 6.29566 | 0 | [200, 86] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162511__768.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4543 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231226_230648__346 | 0 | 0.0 | 15.6908 | 0 | [200, 94] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230648__346.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4544 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_022251__567 | 0 | 0.0 | 25.3848 | 0 | [382, 400] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022251__567.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4545 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_022311__414 | 0 | 0.0 | 19.6931 | 0 | [382, 300] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022311__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4546 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162631__151 | 0 | 0.0 | 6.74475 | 0 | [382, 68] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162631__151.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4547 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162717__248 | 0 | 0.0 | 45.8984 | 0 | [382, 751] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162717__248.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4548 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_230748__726 | 0 | 0.0 | 28.7804 | 0 | [382, 460] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_230748__726.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4549 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_022210__247 | 0 | 0.0 | 14.8225 | 0 | [379, 214] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_022210__247.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4550 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_022226__523 | 3 | 0.0 | 16.2649 | 2 | [379, 240] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_022226__523.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4551 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_162606__887 | 0 | 0.0 | 13.7023 | 0 | [379, 195] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_162606__887.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4552 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_162624__593 | 0 | 0.0 | 18.2922 | 2 | [379, 278] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_162624__593.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4553 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 3 | 20231226_230719__906 | 0 | 0.0 | 14.2139 | 0 | [379, 204] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_230719__906.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4554 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 3 | 20231219_225644__355 | 0 | 0.0 | 5.93799 | 0 | [1, 191] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_225644__355.json | 0.0 | missing | missing | missing | |
| 4555 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 3 | 20231225_024452__844 | 0 | 0.0 | 6.25299 | 0 | [67, 241] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_024452__844.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4556 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 3 | 20231225_024509__590 | 0 | 0.0 | 16.7505 | 0 | [67, 637] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_024509__590.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4557 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 3 | 20231225_164910__203 | 0 | 0.0 | 19.6784 | 0 | [67, 743] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_164910__203.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4558 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 3 | 20231225_164935__419 | 0 | 0.0 | 25.3039 | 0 | [67, 939] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_164935__419.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4559 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231225_024422__584 | 0 | 0.0 | 2.14733 | 0 | [70, 77] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_024422__584.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4560 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231225_024446__712 | 0 | 0.0 | 24.144 | 0 | [70, 895] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_024446__712.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4561 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231225_164834__842 | 0 | 0.0 | 1.91842 | 0 | [70, 68] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_164834__842.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4562 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231225_164850__589 | 0 | 0.0 | 16.0543 | 0 | [70, 612] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_164850__589.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4563 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231226_231758__701 | 0 | 0.0 | 1.77632 | 0 | [70, 62] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_231758__701.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4564 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_024419__793 | 0 | 0.0 | 22.8901 | 0 | [107, 843] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_024419__793.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4565 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_024420__330 | 0 | 0.0 | 0.432599 | 0 | [107, 4] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_024420__330.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4566 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_164815__675 | 0 | 0.0 | 19.9099 | 0 | [107, 741] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_164815__675.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4567 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_164832__840 | 0 | 0.0 | 17.168 | 0 | [107, 645] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_164832__840.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4568 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_231756__172 | 0 | 0.0 | 20.5682 | 0 | [107, 761] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_231756__172.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4569 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_024354__530 | 0 | 0.0 | 5.9611 | 0 | [187, 69] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024354__530.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4570 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_024356__878 | 0 | 0.0 | 2.17668 | 0 | [187, 65] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024356__878.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4571 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_164753__170 | 0 | 0.0 | 4.77524 | 0 | [187, 22] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164753__170.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4572 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_164755__653 | 0 | 0.0 | 2.50078 | 0 | [187, 78] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164755__653.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4573 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231226_231736__383 | 0 | 0.0 | 5.38656 | 0 | [187, 55] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231736__383.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4574 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_024526__548 | 0 | 0.0 | 7.35115 | 0 | [359, 233] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024526__548.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4575 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_024528__535 | 0 | 0.0 | 1.30283 | 0 | [359, 4] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024528__535.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4576 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_164953__847 | 0 | 0.0 | 1.33173 | 0 | [359, 5] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164953__847.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4577 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_164956__539 | 0 | 0.0 | 2.65685 | 0 | [359, 57] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164956__539.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4578 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_231838__649 | 0 | 0.0 | 9.52533 | 0 | [359, 312] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231838__649.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4579 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_024515__747 | 0 | 0.0 | 5.33047 | 0 | [356, 158] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_024515__747.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4580 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_024519__787 | 0 | 0.0 | 4.25717 | 0 | [356, 118] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_024519__787.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4581 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_164936__689 | 0 | 0.0 | 1.22583 | 0 | [356, 1] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_164936__689.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4582 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_164952__408 | 0 | 0.0 | 15.6023 | 0 | [356, 530] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_164952__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4583 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231226_231829__157 | 0 | 0.0 | 30.8199 | 0 | [356, 1028] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_231829__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4584 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 3 | 20231214_002539__551 | 0 | 0.0 | 7.88952 | 0 | [59, 239] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_002539__551.json | 0.0 | missing | missing | missing | |
| 4585 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 3 | 20231225_022629__256 | 0 | 0.0 | 25.1775 | 0 | [84, 189] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_022629__256.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4586 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | AsIs | 1SHOT | true | true | 3 | 20231225_022648__289 | 3 | 0.0 | 19.4647 | 2 | [84, 143] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_022648__289.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4587 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | AsIs | 1SHOT | true | true | 3 | 20231225_163104__543 | 3 | 0.0 | 17.5313 | 2 | [84, 128] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_163104__543.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4588 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 3 | 20231225_163132__294 | 0 | 0.0 | 27.7395 | 0 | [84, 211] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_163132__294.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4589 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231225_022542__396 | 3 | 0.0 | 34.4357 | 2 | [86, 264] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_022542__396.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4590 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231225_022603__902 | 3 | 0.0 | 21.2343 | 2 | [86, 157] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_022603__902.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4591 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231225_163015__458 | 3 | 0.0 | 35.9551 | 2 | [86, 278] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_163015__458.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4592 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231225_163047__577 | 3 | 0.0 | 31.0651 | 2 | [86, 238] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_163047__577.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4593 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231226_230952__444 | 3 | 0.0 | 31.7118 | 2 | [86, 241] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_230952__444.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4594 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_022441__240 | 3 | 0.0 | 29.0499 | 2 | [126, 215] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_022441__240.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4595 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_022508__848 | 3 | 0.0 | 26.9198 | 2 | [126, 198] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_022508__848.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4596 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_162919__482 | 3 | 0.0 | 29.3778 | 2 | [126, 219] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_162919__482.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4597 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_162939__489 | 3 | 0.0 | 20.6128 | 2 | [126, 148] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_162939__489.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4598 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_230921__250 | 3 | 0.0 | 26.0644 | 2 | [126, 191] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_230921__250.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4599 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_022354__939 | 3 | 0.0 | 43.3092 | 2 | [208, 131] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_022354__939.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4600 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_022411__904 | 3 | 0.0 | 17.0455 | 2 | [208, 102] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_022411__904.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4601 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_162819__150 | 3 | 0.0 | 62.264 | 2 | [208, 291] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162819__150.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4602 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_162849__217 | 3 | 0.0 | 30.2435 | 2 | [208, 209] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162849__217.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4603 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_230854__296 | 3 | 0.0 | 66.8553 | 2 | [208, 333] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230854__296.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4604 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_022854__555 | 3 | 0.0 | 54.9123 | 2 | [390, 365] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022854__555.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4605 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_022919__446 | 3 | 0.0 | 24.9724 | 2 | [390, 132] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022919__446.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4606 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_163311__646 | 3 | 0.0 | 21.0563 | 2 | [390, 102] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_163311__646.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4607 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_163342__913 | 3 | 0.0 | 30.4332 | 2 | [390, 176] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_163342__913.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4608 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231226_231055__310 | 3 | 0.0 | 20.9444 | 2 | [390, 101] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231055__310.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4609 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_022724__796 | 3 | 0.0 | 35.3862 | 2 | [387, 214] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_022724__796.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4610 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_022759__420 | 3 | 0.0 | 35.7649 | 2 | [387, 217] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_022759__420.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4611 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_163203__373 | 0 | 0.0 | 31.2006 | 0 | [387, 181] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_163203__373.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4612 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_163250__197 | 3 | 0.0 | 46.9267 | 2 | [387, 305] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_163250__197.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4613 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_231034__659 | 3 | 0.0 | 41.2017 | 2 | [387, 260] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_231034__659.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4614 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231219_224654__577 | 0 | 0.0 | 8.33685 | 0 | [1, 265] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_224654__577.json | 0.0 | missing | missing | missing | |
| 4615 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_023906__406 | 0 | 0.0 | 7.34922 | 0 | [77, 117] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_023906__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4616 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_023916__310 | 0 | 0.0 | 10.1062 | 0 | [77, 166] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_023916__310.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4617 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_164319__432 | 0 | 0.0 | 13.8013 | 0 | [77, 231] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_164319__432.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4618 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 3 | 20231225_164336__227 | 0 | 0.0 | 16.9366 | 0 | [77, 286] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_164336__227.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4619 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_023840__685 | 0 | 0.0 | 9.30966 | 2 | [79, 152] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_023840__685.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4620 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_023858__816 | 0 | 0.0 | 17.7414 | 2 | [79, 299] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_023858__816.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4621 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_164249__121 | 0 | 0.0 | 9.87758 | 2 | [79, 162] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_164249__121.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4622 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_164305__736 | 0 | 0.0 | 15.7425 | 2 | [79, 265] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_164305__736.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4623 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 3 | 20231226_231518__292 | 0 | 0.0 | 10.7384 | 0 | [79, 177] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_231518__292.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4624 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_023823__381 | 0 | 0.0 | 10.6394 | 0 | [121, 170] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_023823__381.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4625 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_023831__277 | 0 | 0.0 | 8.48093 | 0 | [121, 132] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_023831__277.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4626 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164230__796 | 3 | 0.0 | 14.9636 | 2 | [121, 246] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164230__796.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4627 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164239__500 | 0 | 0.0 | 9.39463 | 2 | [121, 148] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164239__500.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4628 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_231507__970 | 0 | 0.0 | 4.84049 | 0 | [121, 68] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_231507__970.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4629 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_023757__768 | 3 | 0.0 | 22.3564 | 2 | [202, 195] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_023757__768.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4630 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_023812__313 | 0 | 0.0 | 14.6249 | 0 | [202, 225] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_023812__313.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4631 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_164200__595 | 0 | 0.0 | 24.4762 | 0 | [202, 240] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164200__595.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4632 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_164215__145 | 0 | 0.0 | 14.9403 | 2 | [202, 230] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164215__145.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4633 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_231502__690 | 3 | 0.0 | 23.2629 | 2 | [202, 226] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231502__690.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4634 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_024001__713 | 0 | 0.0 | 15.0078 | 0 | [388, 202] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024001__713.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4635 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_024015__612 | 0 | 0.0 | 13.9952 | 0 | [388, 185] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024015__612.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4636 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_164434__886 | 0 | 0.0 | 22.1133 | 2 | [388, 322] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164434__886.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4637 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_164451__213 | 0 | 0.0 | 16.2074 | 0 | [388, 223] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164451__213.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4638 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_231546__486 | 0 | 0.0 | 14.1104 | 0 | [388, 187] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231546__486.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4639 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_023930__524 | 0 | 0.0 | 13.7603 | 0 | [386, 181] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_023930__524.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4640 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_023946__748 | 3 | 0.0 | 16.2422 | 2 | [386, 223] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_023946__748.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4641 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164352__905 | 0 | 0.0 | 15.9337 | 2 | [386, 218] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164352__905.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4642 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164412__343 | 0 | 0.0 | 20.4831 | 2 | [386, 295] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164412__343.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4643 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_231532__220 | 0 | 0.0 | 14.3949 | 2 | [386, 192] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_231532__220.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4644 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | AsIs | 1SHOT | false | false | 3 | 20231214_002315__475 | 0 | 0.0 | 10.4351 | 0 | [59, 316] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__AsIs__1SHOT__20231214_002315__475.json | 0.0 | missing | missing | missing | |
| 4645 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | AsIs | 1SHOT | false | false | 3 | 20231225_022009__179 | 0 | 0.0 | 3.04334 | 0 | [78, 171] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_022009__179.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4646 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | AsIs | 1SHOT | false | false | 3 | 20231225_022011__273 | 0 | 0.0 | 2.12544 | 0 | [78, 116] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_022011__273.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4647 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | AsIs | 1SHOT | false | false | 3 | 20231225_162415__348 | 0 | 0.0 | 3.17062 | 0 | [78, 179] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_162415__348.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4648 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | AsIs | 1SHOT | false | false | 3 | 20231225_162419__734 | 0 | 0.0 | 3.36824 | 0 | [78, 191] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_162419__734.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4649 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | false | false | 3 | 20231225_022003__933 | 0 | 0.0 | 6.59664 | 0 | [81, 372] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_022003__933.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4650 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | false | false | 3 | 20231225_022006__203 | 0 | 0.0 | 2.69217 | 0 | [81, 150] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_022006__203.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4651 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | false | false | 3 | 20231225_162409__914 | 0 | 0.0 | 6.22203 | 0 | [81, 354] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_162409__914.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4652 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | false | false | 3 | 20231225_162412__578 | 0 | 0.0 | 3.41342 | 0 | [81, 193] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_162412__578.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4653 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | false | false | 3 | 20231226_230620__807 | 0 | 0.0 | 6.31514 | 0 | [81, 357] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_230620__807.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4654 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_021956__324 | 0 | 0.0 | 1.56328 | 0 | [118, 77] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_021956__324.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4655 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_021957__936 | 0 | 0.0 | 1.15211 | 0 | [118, 51] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_021957__936.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4656 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162400__939 | 0 | 0.0 | 1.28942 | 0 | [118, 61] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_162400__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4657 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162403__414 | 0 | 0.0 | 2.94006 | 0 | [118, 159] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_162403__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4658 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_230614__227 | 0 | 0.0 | 2.54152 | 0 | [118, 135] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_230614__227.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4659 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_021949__957 | 0 | 0.0 | 6.3683 | 0 | [196, 168] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021949__957.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4660 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_021954__339 | 0 | 0.0 | 4.78461 | 0 | [196, 244] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021954__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4661 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_162352__230 | 0 | 0.0 | 7.15043 | 0 | [196, 232] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162352__230.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4662 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_162358__455 | 0 | 0.0 | 6.7291 | 0 | [196, 353] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162358__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4663 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231226_230611__187 | 0 | 0.0 | 6.87398 | 0 | [196, 209] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230611__187.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4664 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_022034__609 | 0 | 0.0 | 8.89758 | 0 | [368, 416] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022034__609.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4665 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_022039__485 | 0 | 0.0 | 4.96616 | 0 | [368, 213] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022039__485.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4666 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162438__694 | 0 | 0.0 | 7.30046 | 0 | [368, 336] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162438__694.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4667 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162444__455 | 0 | 0.0 | 5.0431 | 0 | [368, 219] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162444__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4668 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_230632__102 | 0 | 0.0 | 5.93817 | 0 | [368, 265] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_230632__102.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4669 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_022019__559 | 0 | 0.0 | 8.04899 | 0 | [366, 373] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_022019__559.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4670 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_022025__992 | 0 | 0.0 | 5.23332 | 0 | [366, 228] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_022025__992.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4671 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_162426__819 | 0 | 0.0 | 7.0332 | 0 | [366, 324] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_162426__819.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4672 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_162431__512 | 0 | 0.0 | 5.42125 | 0 | [366, 240] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_162431__512.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4673 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 3 | 20231226_230626__375 | 0 | 0.0 | 5.89345 | 0 | [366, 263] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_230626__375.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4674 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | AsIs | 1SHOT | false | false | 3 | 20231214_001702__757 | 0 | 0.0 | 8.15276 | 0 | [59, 246] | 0.10.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__AsIs__1SHOT__20231214_001702__757.json | 0.0 | missing | missing | missing | |
| 4675 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | AsIs | 1SHOT | false | false | 3 | 20231225_020349__559 | 0 | 0.0 | 8.62315 | 0 | [76, 274] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_020349__559.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4676 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | AsIs | 1SHOT | false | false | 3 | 20231225_020356__341 | 0 | 0.0 | 7.38818 | 0 | [76, 233] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_020356__341.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4677 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | AsIs | 1SHOT | false | false | 3 | 20231225_160941__661 | 0 | 0.0 | 10.9352 | 0 | [76, 353] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_160941__661.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4678 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | AsIs | 1SHOT | false | false | 3 | 20231225_160946__575 | 0 | 0.0 | 4.91258 | 0 | [76, 152] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_160946__575.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4679 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | true | 3 | 20231225_020334__104 | 0 | 0.0 | 4.29972 | 2 | [79, 130] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_020334__104.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4680 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | true | 3 | 20231225_020340__936 | 3 | 0.0 | 6.43484 | 2 | [79, 201] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_020340__936.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4681 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | true | 3 | 20231225_160920__411 | 0 | 0.0 | 6.7432 | 2 | [79, 214] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_160920__411.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4682 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | true | 3 | 20231225_160930__989 | 0 | 0.0 | 9.07642 | 2 | [79, 290] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_160930__989.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4683 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | true | 3 | 20231226_225942__956 | 3 | 0.0 | 6.70632 | 2 | [79, 211] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_225942__956.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4684 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_020323__338 | 3 | 0.0 | 5.40949 | 2 | [121, 162] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_020323__338.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4685 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_020329__491 | 3 | 0.0 | 6.40385 | 2 | [121, 195] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_020329__491.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4686 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_160909__210 | 0 | 0.0 | 5.9557 | 2 | [121, 181] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_160909__210.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4687 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_160914__275 | 3 | 0.0 | 4.66073 | 2 | [121, 138] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_160914__275.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4688 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_225935__944 | 3 | 0.0 | 5.12813 | 2 | [121, 153] | 0.10.0-DEV | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_225935__944.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |

*Remaining evaluation rows truncated for readability. The full set of raw results is available as JSON files under the `code_generation` directory (loaded above via `load_evals(DIR_RESULTS)`).*
| 4761 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_000513__634 | 0 | 0.0 | 2.67062 | 0 | [0, 203] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000513__634.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4762 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_000446__465 | 2 | 0.0 | 1.6363 | 3 | [0, 123] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000446__465.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4763 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_000448__389 | 0 | 0.0 | 0.491966 | 0 | [0, 37] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000448__389.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4764 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_000448__896 | 2 | 0.0 | 1.21078 | 3 | [0, 92] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000448__896.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4765 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_000449__546 | 0 | 0.0 | 0.519794 | 0 | [0, 37] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000449__546.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4766 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_000449__846 | 4 | 0.0 | 0.587779 | 2 | [0, 43] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000449__846.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4767 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_000439__964 | 4 | 0.0 | 4.42432 | 2 | [0, 327] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000439__964.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4768 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_000440__277 | 0 | 0.0 | 0.726197 | 0 | [0, 53] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000440__277.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4769 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_000440__761 | 0 | 0.0 | 0.504418 | 0 | [0, 37] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000440__761.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4770 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_000441__465 | 0 | 0.0 | 0.720709 | 0 | [0, 53] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000441__465.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4771 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_000442__898 | 0 | 0.0 | 0.753167 | 0 | [0, 55] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000442__898.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4772 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000601__426 | 0 | 0.0 | 2.40806 | 0 | [0, 178] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000601__426.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4773 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000603__141 | 0 | 0.0 | 1.80725 | 0 | [0, 134] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000603__141.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4774 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000610__612 | 0 | 0.0 | 6.81367 | 0 | [0, 497] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000610__612.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4775 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000614__279 | 5 | 0.0 | 4.16185 | 3 | [0, 306] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000614__279.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4776 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_000615__757 | 0 | 0.0 | 1.29459 | 0 | [0, 96] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000615__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4777 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000534__650 | 0 | 0.0 | 3.18931 | 0 | [0, 235] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000534__650.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4778 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000537__215 | 0 | 0.0 | 2.15224 | 0 | [0, 159] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000537__215.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4779 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000538__517 | 0 | 0.0 | 1.63557 | 0 | [0, 121] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000538__517.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4780 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000542__213 | 0 | 0.0 | 3.3579 | 0 | [0, 247] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000542__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4781 | NVIDIA-RTX-4090-4x | clean_column | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000546__422 | 0 | 0.0 | 3.78245 | 0 | [0, 279] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000546__422.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4782 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231214_003310__779 | 0 | 0.0 | 6.40778 | 0 | [62, 193] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__AsIs__1SHOT__20231214_003310__779.json | 0.0 | missing | missing | missing | |
| 4783 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_030141__428 | 0 | 0.0 | 1.63105 | 0 | [58, 21] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_030141__428.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4784 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_030143__780 | 0 | 0.0 | 1.67494 | 0 | [58, 22] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_030143__780.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4785 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231214_003303__220 | 0 | 0.0 | 13.963 | 0 | [79, 413] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_003303__220.json | 0.0 | missing | missing | missing | |
| 4786 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_030135__931 | 0 | 0.0 | 4.12322 | 0 | [61, 69] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_030135__931.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4787 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_030139__702 | 0 | 0.0 | 3.9206 | 0 | [61, 65] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_030139__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4788 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_003249__958 | 0 | 0.0 | 5.44791 | 0 | [108, 148] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_003249__958.json | 0.0 | missing | missing | missing | |
| 4789 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_030121__760 | 0 | 0.0 | 4.43439 | 0 | [62, 75] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_030121__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4790 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_030131__181 | 0 | 0.0 | 10.715 | 0 | [62, 195] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_030131__181.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4791 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_003244__654 | 0 | 0.0 | 12.7306 | 0 | [184, 341] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003244__654.json | 50.0 | missing | missing | missing | |
| 4792 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_030113__966 | 0 | 0.0 | 12.9126 | 0 | [77, 41] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030113__966.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4793 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_030116__990 | 0 | 0.0 | 2.87617 | 0 | [77, 40] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030116__990.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4794 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_003339__899 | 0 | 0.0 | 11.6174 | 0 | [11, 322] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003339__899.json | 0.0 | missing | missing | missing | |
| 4795 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_030159__284 | 0 | 0.0 | 0.857751 | 0 | [79, 1] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030159__284.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4796 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_030200__755 | 0 | 0.0 | 1.31922 | 0 | [79, 10] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030200__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4797 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_003327__699 | 0 | 0.0 | 17.7037 | 0 | [379, 399] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_003327__699.json | 0.0 | missing | missing | missing | |
| 4798 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_030145__254 | 0 | 0.0 | 1.8931 | 0 | [76, 21] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_030145__254.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4799 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_030158__553 | 0 | 0.0 | 13.1411 | 0 | [76, 236] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_030158__553.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4800 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_000942__652 | 0 | 0.0 | 8.31355 | 0 | [0, 300] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000942__652.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4801 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_000950__931 | 0 | 0.0 | 7.22672 | 0 | [0, 261] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000950__931.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4802 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_000951__230 | 0 | 0.0 | 1.49096 | 0 | [0, 54] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000951__230.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4803 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_000955__610 | 0 | 0.0 | 3.64671 | 0 | [0, 132] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000955__610.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4804 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_001002__514 | 0 | 0.0 | 6.95694 | 0 | [0, 251] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_001002__514.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4805 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_000844__348 | 0 | 0.0 | 0.815979 | 0 | [0, 29] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000844__348.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4806 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_000849__540 | 4 | 0.0 | 5.25626 | 2 | [0, 187] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000849__540.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4807 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_000852__308 | 0 | 0.0 | 2.91126 | 0 | [0, 104] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000852__308.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4808 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_000857__707 | 0 | 0.0 | 4.76487 | 0 | [0, 170] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000857__707.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4809 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_000903__951 | 0 | 0.0 | 5.78901 | 0 | [0, 206] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000903__951.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4810 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_000814__488 | 0 | 0.0 | 1.43352 | 0 | [0, 51] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000814__488.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4811 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_000819__506 | 0 | 0.0 | 4.98221 | 0 | [0, 176] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000819__506.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4812 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_000820__320 | 0 | 0.0 | 1.41712 | 0 | [0, 51] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000820__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4813 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_000823__696 | 0 | 0.0 | 2.46646 | 2 | [0, 89] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000823__696.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4814 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_000827__615 | 0 | 0.0 | 4.13751 | 0 | [0, 149] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000827__615.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4815 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_001112__951 | 5 | 0.0 | 11.0264 | 3 | [0, 392] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_001112__951.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4816 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_001121__565 | 0 | 0.0 | 8.19961 | 0 | [0, 292] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_001121__565.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4817 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_001128__961 | 0 | 0.0 | 7.01434 | 0 | [0, 250] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_001128__961.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4818 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_001131__509 | 0 | 0.0 | 2.73443 | 0 | [0, 98] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_001131__509.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4819 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_001142__156 | 0 | 0.0 | 11.235 | 0 | [0, 399] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_001142__156.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4820 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_001015__945 | 0 | 0.0 | 1.42337 | 0 | [0, 51] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_001015__945.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4821 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_001018__630 | 0 | 0.0 | 2.87226 | 0 | [0, 103] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_001018__630.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4822 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_001019__700 | 0 | 0.0 | 1.17357 | 0 | [0, 42] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_001019__700.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4823 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_001034__696 | 0 | 0.0 | 14.9213 | 0 | [0, 529] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_001034__696.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4824 | NVIDIA-RTX-4090-4x | clean_column | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_001036__517 | 0 | 0.0 | 1.42457 | 0 | [0, 51] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_001036__517.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4825 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_000015__564 | 4 | 0.0 | 8.04415 | 2 | [0, 198] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_000015__564.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4826 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_000029__722 | 4 | 0.0 | 14.0586 | 2 | [0, 346] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_000029__722.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4827 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_000037__357 | 4 | 0.0 | 7.69634 | 2 | [0, 190] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_000037__357.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4828 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_000049__756 | 5 | 0.0 | 11.9824 | 3 | [0, 295] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_000049__756.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4829 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_000058__279 | 0 | 0.0 | 9.12518 | 0 | [0, 225] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_000058__279.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4830 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_235853__458 | 5 | 0.0 | 4.65551 | 3 | [0, 115] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_235853__458.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4831 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_235900__663 | 4 | 0.0 | 6.64806 | 2 | [0, 164] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_235900__663.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4832 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_235908__944 | 0 | 0.0 | 7.99474 | 0 | [0, 197] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_235908__944.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4833 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_235915__649 | 0 | 0.0 | 6.89861 | 0 | [0, 170] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_235915__649.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4834 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_235923__352 | 0 | 0.0 | 7.79456 | 0 | [0, 192] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240131_235923__352.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4835 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_235730__262 | 0 | 0.0 | 10.1632 | 0 | [0, 249] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_235730__262.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4836 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_235734__323 | 0 | 0.0 | 4.47333 | 0 | [0, 110] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_235734__323.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4837 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_235744__285 | 0 | 0.0 | 10.0674 | 0 | [0, 247] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_235744__285.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4838 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_235756__997 | 4 | 0.0 | 11.3664 | 2 | [0, 279] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_235756__997.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4839 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240131_235810__685 | 5 | 0.0 | 14.3402 | 3 | [0, 351] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240131_235810__685.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4840 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000340__961 | 1 | 0.0 | 9.193 | 3 | [0, 223] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000340__961.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4841 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_000347__188 | 0 | 0.0 | 7.15854 | 0 | [0, 174] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000347__188.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4842 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000403__709 | 4 | 0.0 | 15.713 | 2 | [0, 380] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000403__709.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4843 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000409__465 | 5 | 0.0 | 6.20467 | 3 | [0, 151] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000409__465.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4844 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_000419__333 | 0 | 0.0 | 9.44534 | 0 | [0, 229] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000419__333.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4845 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000208__130 | 5 | 0.0 | 9.26397 | 3 | [0, 222] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_000208__130.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4846 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_000216__493 | 0 | 0.0 | 8.43633 | 0 | [0, 205] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_000216__493.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4847 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000224__473 | 5 | 0.0 | 8.14143 | 3 | [0, 198] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_000224__473.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4848 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000234__271 | 5 | 0.0 | 10.0079 | 3 | [0, 243] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_000234__271.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4849 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000240__371 | 5 | 0.0 | 5.20481 | 3 | [0, 127] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_000240__371.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4850 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_235153__881 | 5 | 0.0 | 18.5689 | 3 | [0, 347] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_235153__881.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4851 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_235157__808 | 0 | 0.0 | 4.34761 | 0 | [0, 81] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_235157__808.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4852 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_235208__219 | 5 | 0.0 | 11.3777 | 3 | [0, 213] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_235208__219.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4853 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240131_235216__130 | 0 | 0.0 | 7.37518 | 0 | [0, 138] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_235216__130.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4854 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240131_235230__230 | 5 | 0.0 | 14.0044 | 3 | [0, 262] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240131_235230__230.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4855 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_235012__213 | 4 | 0.0 | 13.9839 | 2 | [0, 261] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_235012__213.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4856 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_235023__419 | 0 | 0.0 | 11.1468 | 0 | [0, 208] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_235023__419.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4857 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240131_235035__514 | 5 | 0.0 | 12.1456 | 3 | [0, 227] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_235035__514.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4858 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_235041__281 | 0 | 0.0 | 6.35707 | 0 | [0, 119] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_235041__281.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4859 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240131_235049__204 | 0 | 0.0 | 8.06403 | 0 | [0, 151] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240131_235049__204.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4860 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_234832__845 | 0 | 0.0 | 12.7107 | 0 | [0, 235] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234832__845.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4861 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_234842__973 | 0 | 0.0 | 10.2707 | 0 | [0, 191] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234842__973.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4862 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_234850__754 | 0 | 0.0 | 8.14294 | 0 | [0, 152] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234850__754.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4863 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_234858__990 | 0 | 0.0 | 8.18981 | 0 | [0, 153] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234858__990.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4864 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240131_234912__488 | 0 | 0.0 | 13.7973 | 0 | [0, 257] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240131_234912__488.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4865 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_235536__353 | 4 | 0.0 | 10.7591 | 2 | [0, 199] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_235536__353.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4866 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_235542__280 | 5 | 0.0 | 6.50722 | 3 | [0, 121] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_235542__280.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4867 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_235552__325 | 5 | 0.0 | 10.0776 | 3 | [0, 187] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_235552__325.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4868 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_235607__977 | 4 | 0.0 | 15.1574 | 2 | [0, 279] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_235607__977.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4869 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240131_235616__435 | 4 | 0.0 | 8.97226 | 2 | [0, 166] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240131_235616__435.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4870 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_235328__282 | 5 | 0.0 | 8.097 | 3 | [0, 150] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_235328__282.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4871 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_235341__690 | 5 | 0.0 | 12.9361 | 3 | [0, 239] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_235341__690.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4872 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_235358__264 | 5 | 0.0 | 17.2856 | 3 | [0, 319] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_235358__264.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4873 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_235406__549 | 4 | 0.0 | 7.78753 | 3 | [0, 144] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_235406__549.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4874 | NVIDIA-RTX-4090-4x | clean_column | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240131_235418__487 | 5 | 0.0 | 12.6573 | 3 | [0, 234] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240131_235418__487.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4875 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_000643__867 | 0 | 0.0 | 1.79301 | 0 | [0, 215] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000643__867.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4876 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_000644__472 | 0 | 0.0 | 1.45634 | 0 | [0, 175] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000644__472.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4877 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_000646__532 | 0 | 0.0 | 1.46688 | 0 | [0, 176] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000646__532.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4878 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_000647__706 | 0 | 0.0 | 1.46649 | 0 | [0, 169] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000647__706.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4879 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_000648__934 | 0 | 0.0 | 0.265792 | 0 | [0, 30] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_000648__934.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4880 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_000632__340 | 0 | 0.0 | 0.242633 | 0 | [0, 29] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000632__340.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4881 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_000632__372 | 0 | 0.0 | 0.244106 | 0 | [0, 29] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000632__372.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4882 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_000632__705 | 0 | 0.0 | 0.24242 | 0 | [0, 29] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000632__705.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4883 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_000632__736 | 0 | 0.0 | 0.241791 | 0 | [0, 29] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000632__736.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4884 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_000633__672 | 0 | 0.0 | 0.24263 | 0 | [0, 29] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_000633__672.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4885 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_000625__846 | 0 | 0.0 | 1.09791 | 0 | [0, 131] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000625__846.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4886 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_000626__238 | 0 | 0.0 | 0.833401 | 0 | [0, 99] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000626__238.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4887 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_000626__805 | 0 | 0.0 | 0.749486 | 0 | [0, 90] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000626__805.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4888 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_000627__416 | 0 | 0.0 | 0.751444 | 0 | [0, 90] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000627__416.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4889 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_000628__385 | 0 | 0.0 | 0.41015 | 0 | [0, 49] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_000628__385.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4890 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000722__676 | 0 | 0.0 | 1.76402 | 0 | [0, 207] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000722__676.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4891 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000724__761 | 0 | 0.0 | 1.91304 | 0 | [0, 224] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000724__761.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4892 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000727__992 | 0 | 0.0 | 2.22453 | 0 | [0, 263] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000727__992.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4893 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000730__647 | 0 | 0.0 | 2.79635 | 0 | [0, 327] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000730__647.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4894 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_000732__954 | 0 | 0.0 | 2.34217 | 0 | [0, 274] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_000732__954.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4895 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000703__195 | 0 | 0.0 | 5.05749 | 0 | [0, 569] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000703__195.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4896 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000706__970 | 0 | 0.0 | 1.91984 | 0 | [0, 215] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000706__970.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4897 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000708__499 | 0 | 0.0 | 2.78021 | 0 | [0, 309] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000708__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4898 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000709__182 | 0 | 0.0 | 0.380593 | 0 | [0, 43] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000709__182.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4899 | NVIDIA-RTX-4090-4x | clean_column | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_000709__369 | 0 | 0.0 | 0.380515 | 0 | [0, 43] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_000709__369.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4900 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_225959__937 | 0 | 0.0 | 6.11645 | 0 | [62, 184] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_225959__937.json | 0.0 | missing | missing | missing | |
| 4901 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_230008__681 | 0 | 0.0 | 9.3389 | 0 | [1, 295] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_230008__681.json | 0.0 | missing | missing | missing | |
| 4902 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_230014__460 | 0 | 0.0 | 5.74226 | 0 | [1, 185] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_230014__460.json | 0.0 | missing | missing | missing | |
| 4903 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_031525__558 | 0 | 0.0 | 27.5533 | 0 | [76, 160] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_031525__558.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4904 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_031549__297 | 0 | 0.0 | 23.6498 | 0 | [76, 136] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_031549__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4905 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_225943__734 | 0 | 0.0 | 11.6448 | 0 | [1, 363] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_225943__734.json | 50.0 | missing | missing | missing | |
| 4906 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_225952__962 | 0 | 0.0 | 9.13887 | 0 | [1, 289] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_225952__962.json | 0.0 | missing | missing | missing | |
| 4907 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_031420__693 | 5 | 0.0 | 43.0673 | 3 | [79, 257] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_031420__693.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4908 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_031457__433 | 5 | 0.0 | 37.0307 | 3 | [79, 219] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_031457__433.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4909 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_233403__341 | 4 | 0.0 | 23.7652 | 2 | [79, 136] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_233403__341.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4910 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_225908__983 | 0 | 0.0 | 10.3914 | 0 | [1, 322] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_225908__983.json | 50.0 | missing | missing | missing | |
| 4911 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_225915__613 | 0 | 0.0 | 7.62156 | 0 | [1, 240] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_225915__613.json | 50.0 | missing | missing | missing | |
| 4912 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_031323__460 | 0 | 0.0 | 32.6024 | 0 | [120, 185] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_031323__460.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4913 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_031337__777 | 0 | 0.0 | 13.9164 | 0 | [120, 69] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_031337__777.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4914 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_233339__198 | 0 | 0.0 | 44.582 | 0 | [120, 262] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_233339__198.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4915 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_225844__424 | 0 | 0.0 | 12.0772 | 0 | [1, 362] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_225844__424.json | 0.0 | missing | missing | missing | |
| 4916 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_225852__797 | 0 | 0.0 | 8.0521 | 0 | [1, 246] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_225852__797.json | 50.0 | missing | missing | missing | |
| 4917 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_031233__282 | 5 | 0.0 | 48.5371 | 3 | [196, 93] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_031233__282.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4918 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_031250__816 | 0 | 0.0 | 17.18 | 0 | [196, 75] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_031250__816.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4919 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_233255__961 | 0 | 0.0 | 49.8479 | 0 | [196, 135] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233255__961.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4920 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_230142__408 | 0 | 0.0 | 16.7298 | 0 | [1, 464] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230142__408.json | 50.0 | missing | missing | missing | |
| 4921 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_230159__191 | 0 | 0.0 | 17.1545 | 0 | [1, 475] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230159__191.json | 50.0 | missing | missing | missing | |
| 4922 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_031754__470 | 4 | 0.0 | 38.3719 | 3 | [408, 174] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_031754__470.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4923 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_031825__421 | 5 | 0.0 | 30.825 | 3 | [408, 128] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_031825__421.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4924 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233541__277 | 4 | 0.0 | 34.1599 | 2 | [408, 149] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233541__277.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4925 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_230053__915 | 0 | 0.0 | 18.5775 | 0 | [1, 512] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230053__915.json | 50.0 | missing | missing | missing | |
| 4926 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_230112__789 | 0 | 0.0 | 18.9801 | 0 | [1, 522] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230112__789.json | 50.0 | missing | missing | missing | |
| 4927 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_031633__659 | 4 | 0.0 | 43.7354 | 3 | [406, 206] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_031633__659.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4928 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_031715__879 | 5 | 0.0 | 41.9202 | 3 | [406, 195] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_031715__879.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4929 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_233507__292 | 4 | 0.0 | 63.0082 | 3 | [406, 323] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_233507__292.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4930 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_234106__535 | 0 | 0.0 | 7.5776 | 0 | [78, 291] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_234106__535.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4931 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_111332__629 | 0 | 0.0 | 5.44829 | 0 | [78, 208] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_111332__629.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4932 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_111337__325 | 0 | 0.0 | 5.59057 | 0 | [78, 214] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_111337__325.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4933 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_111342__901 | 0 | 0.0 | 5.30303 | 0 | [78, 203] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_111342__901.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4934 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234058__264 | 0 | 0.0 | 7.24833 | 0 | [115, 273] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_234058__264.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4935 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_111318__647 | 0 | 0.0 | 1.44452 | 0 | [115, 45] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_111318__647.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4936 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_111325__965 | 0 | 0.0 | 6.15728 | 0 | [115, 231] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_111325__965.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4937 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_111326__166 | 0 | 0.0 | 1.4134 | 0 | [115, 44] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_111326__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4938 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_234051__624 | 0 | 0.0 | 5.40831 | 0 | [189, 58] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234051__624.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4939 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_111308__973 | 0 | 0.0 | 5.47714 | 0 | [189, 67] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111308__973.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4940 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_111312__770 | 0 | 0.0 | 4.20516 | 0 | [189, 145] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111312__770.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4941 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_111317__392 | 0 | 0.0 | 4.92551 | 0 | [189, 172] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111317__392.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4942 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_234120__945 | 0 | 0.0 | 3.68542 | 0 | [367, 95] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234120__945.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4943 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_111406__271 | 0 | 0.0 | 3.52426 | 0 | [367, 89] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111406__271.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4944 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_111413__711 | 0 | 0.0 | 6.17745 | 0 | [367, 188] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111413__711.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4945 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_111422__359 | 0 | 0.0 | 9.24837 | 0 | [367, 301] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111422__359.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4946 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_234116__939 | 0 | 0.0 | 10.0807 | 0 | [364, 331] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_234116__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4947 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_111350__239 | 0 | 0.0 | 7.41423 | 0 | [364, 234] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_111350__239.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4948 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_111354__406 | 0 | 0.0 | 4.15247 | 0 | [364, 113] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_111354__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4949 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_111403__655 | 0 | 0.0 | 8.70999 | 0 | [364, 282] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_111403__655.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4950 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_105859__554 | 0 | 0.0 | 6.26018 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105859__554.json | 25.0 | missing | missing | missing | |
| 4951 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_105902__890 | 0 | 0.0 | 3.12896 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105902__890.json | 50.0 | missing | missing | missing | |
| 4952 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_105910__613 | 0 | 0.0 | 7.93103 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105910__613.json | 50.0 | missing | missing | missing | |
| 4953 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_105913__105 | 0 | 0.0 | 3.15227 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105913__105.json | 50.0 | missing | missing | missing | |
| 4954 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_105915__826 | 0 | 0.0 | 1.86852 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_105915__826.json | 50.0 | missing | missing | missing | |
| 4955 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_105830__871 | 0 | 0.0 | 1.66919 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105830__871.json | 0.0 | missing | missing | missing | |
| 4956 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_105832__235 | 0 | 0.0 | 1.89538 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105832__235.json | 50.0 | missing | missing | missing | |
| 4957 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_105834__979 | 0 | 0.0 | 1.66095 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105834__979.json | 50.0 | missing | missing | missing | |
| 4958 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_105835__857 | 0 | 0.0 | 1.63219 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105835__857.json | 50.0 | missing | missing | missing | |
| 4959 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_105839__136 | 0 | 0.0 | 3.45037 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_105839__136.json | 50.0 | missing | missing | missing | |
| 4960 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_105803__160 | 0 | 0.0 | 2.69049 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105803__160.json | 50.0 | missing | missing | missing | |
| 4961 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_105806__361 | 0 | 0.0 | 2.60273 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105806__361.json | 50.0 | missing | missing | missing | |
| 4962 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_105809__888 | 4 | 0.0 | 2.54655 | 2 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105809__888.json | 86.6667 | missing | missing | missing | |
| 4963 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_105811__662 | 0 | 0.0 | 2.53787 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105811__662.json | 50.0 | missing | missing | missing | |
| 4964 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_105816__210 | 0 | 0.0 | 4.28507 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_105816__210.json | 50.0 | missing | missing | missing | |
| 4965 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_110002__945 | 0 | 0.0 | 2.62919 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110002__945.json | 25.0 | missing | missing | missing | |
| 4966 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_110004__680 | 0 | 0.0 | 1.79111 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110004__680.json | 25.0 | missing | missing | missing | |
| 4967 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_110006__349 | 0 | 0.0 | 2.45548 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110006__349.json | 0.0 | missing | missing | missing | |
| 4968 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_110008__242 | 0 | 0.0 | 1.9329 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110008__242.json | 50.0 | missing | missing | missing | |
| 4969 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_110011__651 | 0 | 0.0 | 2.44496 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110011__651.json | 50.0 | missing | missing | missing | |
| 4970 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_105936__220 | 0 | 0.0 | 2.04549 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105936__220.json | 50.0 | missing | missing | missing | |
| 4971 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_105939__276 | 0 | 0.0 | 2.33542 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105939__276.json | 25.0 | missing | missing | missing | |
| 4972 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_105940__341 | 4 | 0.0 | 1.78014 | 3 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105940__341.json | 95.0 | missing | missing | missing | |
| 4973 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_105942__697 | 0 | 0.0 | 1.70644 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105942__697.json | 50.0 | missing | missing | missing | |
| 4974 | Apple-MacBook-Pro-M1 | clean_column | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_105944__553 | 0 | 0.0 | 1.9504 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_105944__553.json | 25.0 | missing | missing | missing | |
| 4975 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_224023__935 | 0 | 0.0 | 19.5224 | 0 | [0, 299] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_224023__935.json | 50.0 | missing | missing | missing | |
| 4976 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_224037__818 | 0 | 0.0 | 14.5982 | 0 | [0, 220] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_224037__818.json | 50.0 | missing | missing | missing | |
| 4977 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_224057__801 | 0 | 0.0 | 19.714 | 0 | [0, 302] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_224057__801.json | 50.0 | missing | missing | missing | |
| 4978 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_224116__819 | 0 | 0.0 | 19.2793 | 0 | [0, 296] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_224116__819.json | 50.0 | missing | missing | missing | |
| 4979 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_224139__443 | 0 | 0.0 | 22.2473 | 0 | [0, 335] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_224139__443.json | 50.0 | missing | missing | missing | |
| 4980 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_223825__323 | 0 | 0.0 | 1.77938 | 0 | [0, 27] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_223825__323.json | 50.0 | missing | missing | missing | |
| 4981 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_223826__853 | 0 | 0.0 | 1.70581 | 0 | [0, 25] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_223826__853.json | 0.0 | missing | missing | missing | |
| 4982 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_223828__261 | 0 | 0.0 | 1.96544 | 0 | [0, 28] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_223828__261.json | 50.0 | missing | missing | missing | |
| 4983 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_223830__230 | 0 | 0.0 | 2.09882 | 0 | [0, 31] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_223830__230.json | 0.0 | missing | missing | missing | |
| 4984 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_223832__506 | 0 | 0.0 | 1.60867 | 0 | [0, 25] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_223832__506.json | 0.0 | missing | missing | missing | |
| 4985 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_223637__224 | 0 | 0.0 | 23.119 | 0 | [0, 355] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_223637__224.json | 25.0 | missing | missing | missing | |
| 4986 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_223657__343 | 0 | 0.0 | 19.3976 | 0 | [0, 298] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_223657__343.json | 25.0 | missing | missing | missing | |
| 4987 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_223719__221 | 0 | 0.0 | 22.0589 | 0 | [0, 337] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_223719__221.json | 25.0 | missing | missing | missing | |
| 4988 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_223741__819 | 0 | 0.0 | 22.3275 | 0 | [0, 337] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_223741__819.json | 25.0 | missing | missing | missing | |
| 4989 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_223802__749 | 0 | 0.0 | 21.324 | 0 | [0, 325] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_223802__749.json | 25.0 | missing | missing | missing | |
| 4990 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_224707__241 | 0 | 0.0 | 22.642 | 0 | [0, 345] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_224707__241.json | 50.0 | missing | missing | missing | |
| 4991 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_224724__153 | 0 | 0.0 | 17.108 | 0 | [0, 262] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_224724__153.json | 50.0 | missing | missing | missing | |
| 4992 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_224748__242 | 0 | 0.0 | 23.83 | 0 | [0, 363] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_224748__242.json | 50.0 | missing | missing | missing | |
| 4993 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_224809__271 | 0 | 0.0 | 21.7406 | 0 | [0, 327] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_224809__271.json | 50.0 | missing | missing | missing | |
| 4994 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_224828__764 | 0 | 0.0 | 18.3403 | 0 | [0, 279] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_224828__764.json | 50.0 | missing | missing | missing | |
| 4995 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_224345__611 | 0 | 0.0 | 21.6164 | 0 | [0, 328] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_224345__611.json | 50.0 | missing | missing | missing | |
| 4996 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_224405__392 | 0 | 0.0 | 20.796 | 0 | [0, 317] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_224405__392.json | 50.0 | missing | missing | missing | |
| 4997 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_224425__429 | 0 | 0.0 | 19.9729 | 0 | [0, 301] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_224425__429.json | 0.0 | missing | missing | missing | |
| 4998 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_224448__163 | 0 | 0.0 | 22.8285 | 0 | [0, 342] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_224448__163.json | 50.0 | missing | missing | missing | |
| 4999 | Apple-MacBook-Pro-M1 | clean_column | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_224506__464 | 0 | 0.0 | 17.9169 | 0 | [0, 272] | 0.13.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_224506__464.json | 50.0 | missing | missing | missing | |
| 5000 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231213_202043__783 | 0 | 0.0003195 | 5.48944 | 0 | [69, 190] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_202043__783.json | 0.0 | missing | missing | missing | |
| 5001 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_190744__676 | 0 | 0.000255 | 2.8065 | 0 | [69, 147] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_190744__676.json | 0.0 | missing | missing | missing | |
| 5002 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_190747__945 | 0 | 0.0003105 | 2.94562 | 0 | [69, 184] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_190747__945.json | 0.0 | missing | missing | missing | |
| 5003 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo--optim | AsIs | 1SHOT | false | false | 5 | 20231215_193143__313 | 0 | 0.0 | 3.64879 | 0 | [69, 153] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_193143__313.json | 0.0 | 0.5 | missing | 0.5 | |
| 5004 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_202038__497 | 5 | 0.000243 | 3.36794 | 3 | [72, 138] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_202038__497.json | 100.0 | missing | missing | missing | |
| 5005 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_190739__182 | 5 | 0.000303 | 3.03777 | 3 | [72, 178] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_190739__182.json | 100.0 | missing | missing | missing | |
| 5006 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_190741__953 | 5 | 0.000297 | 2.61548 | 3 | [72, 174] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_190741__953.json | 100.0 | missing | missing | missing | |
| 5007 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_194302__790 | 5 | 0.000252 | 2.74381 | 3 | [72, 144] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_194302__790.json | 100.0 | missing | missing | missing | |
| 5008 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_194305__626 | 5 | 0.0003405 | 3.33733 | 3 | [72, 203] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_194305__626.json | 100.0 | missing | missing | missing | |
| 5009 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_193140__939 | 5 | 0.0 | 3.75006 | 3 | [72, 153] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_193140__939.json | 100.0 | 0.5 | missing | 0.5 |
| 5010 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202034__315 | 0 | 0.0001555 | 2.39058 | 0 | [107, 68] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_202034__315.json | 50.0 | missing | missing | missing | |
| 5011 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_190734__991 | 4 | 0.0001345 | 1.31337 | 2 | [107, 54] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_190734__991.json | 86.6667 | missing | missing | missing | |
| 5012 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_190736__633 | 0 | 0.000178 | 1.58401 | 0 | [107, 83] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_190736__633.json | 50.0 | missing | missing | missing | |
| 5013 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194258__783 | 0 | 0.000121 | 1.01365 | 0 | [107, 45] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_194258__783.json | 50.0 | missing | missing | missing | |
| 5014 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194259__916 | 5 | 0.0001075 | 1.14803 | 3 | [107, 36] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_194259__916.json | 100.0 | missing | missing | missing | |
| 5015 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193136__194 | 5 | 0.0 | 1.4185 | 3 | [107, 44] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_193136__194.json | 100.0 | 0.5 | missing | 0.5 |
| 5016 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_202032__361 | 0 | 0.0001645 | 1.48428 | 0 | [170, 53] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202032__361.json | 0.0 | missing | missing | missing | |
| 5017 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_190732__348 | 0 | 0.0001375 | 1.02204 | 0 | [170, 35] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190732__348.json | 0.0 | missing | missing | missing | |
| 5018 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_190733__920 | 0 | 0.0001585 | 0.940524 | 0 | [170, 49] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190733__920.json | 0.0 | missing | missing | missing | |
| 5019 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_194255__543 | 0 | 0.0002425 | 1.99959 | 0 | [170, 105] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194255__543.json | 0.0 | missing | missing | missing | |
| 5020 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_194257__207 | 0 | 0.000163 | 1.13259 | 0 | [170, 52] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194257__207.json | 0.0 | missing | missing | missing | |
| 5021 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_193135__758 | 0 | 0.0 | 1.73821 | 0 | [170, 69] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193135__758.json | 0.0 | 0.5 | missing | 0.5 |
| 5022 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_202048__617 | 0 | 0.000282 | 2.26676 | 0 | [330, 78] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202048__617.json | 0.0 | missing | missing | missing | |
| 5023 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_190753__959 | 0 | 0.000363 | 2.7989 | 0 | [330, 132] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190753__959.json | 0.0 | missing | missing | missing | |
| 5024 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_190755__404 | 0 | 0.0003195 | 2.11774 | 0 | [330, 103] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190755__404.json | 0.0 | missing | missing | missing | |
| 5025 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_194310__662 | 0 | 0.0002715 | 1.44965 | 0 | [330, 71] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194310__662.json | 0.0 | missing | missing | missing | |
| 5026 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_194313__514 | 0 | 0.000378 | 2.88805 | 0 | [330, 142] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194313__514.json | 50.0 | missing | missing | missing | |
| 5027 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_193147__395 | 0 | 0.0 | 1.75908 | 0 | [330, 73] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193147__395.json | 0.0 | 0.5 | missing | 0.5 |
| 5028 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_202046__619 | 0 | 0.000271 | 2.0821 | 0 | [329, 71] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_202046__619.json | 0.0 | missing | missing | missing | |
| 5029 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_190748__741 | 0 | 0.0002485 | 1.0993 | 0 | [329, 56] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_190748__741.json | 0.0 | missing | missing | missing | |
| 5030 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_190750__233 | 0 | 0.000256 | 1.46256 | 0 | [329, 61] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_190750__233.json | 0.0 | missing | missing | missing | |
| 5031 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_194307__758 | 0 | 0.0002695 | 1.60047 | 0 | [329, 70] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_194307__758.json | 0.0 | missing | missing | missing | |
| 5032 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_194308__892 | 0 | 0.0002755 | 1.65571 | 0 | [329, 74] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_194308__892.json | 0.0 | missing | missing | missing | |
| 5033 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_193145__474 | 0 | 0.0 | 1.97639 | 0 | [329, 74] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_193145__474.json | 0.0 | 0.5 | missing | 0.5 |
| 5034 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200339__881 | 4 | 0.000198 | 0.940621 | 2 | [72, 108] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200339__881.json | 86.6667 | missing | missing | missing | |
| 5035 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | false | 5 | 20240201_200340__156 | 0 | 0.0002385 | 1.04265 | 0 | [72, 135] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200340__156.json | 25.0 | missing | missing | missing | |
| 5036 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200341__544 | 4 | 0.0002055 | 0.991482 | 2 | [72, 113] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200341__544.json | 86.6667 | missing | missing | missing | |
| 5037 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200342__688 | 4 | 0.000192 | 0.866882 | 2 | [72, 104] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200342__688.json | 86.6667 | missing | missing | missing | |
| 5038 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200343__207 | 4 | 0.00021 | 1.00223 | 2 | [72, 116] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200343__207.json | 86.6667 | missing | missing | missing | |
| 5039 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200336__309 | 3 | 0.000109 | 0.528777 | 2 | [107, 37] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200336__309.json | 81.6667 | missing | missing | missing | |
| 5040 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200336__964 | 4 | 0.000106 | 0.727525 | 2 | [107, 35] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200336__964.json | 86.6667 | missing | missing | missing | |
| 5041 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200337__367 | 4 | 0.000106 | 0.466545 | 2 | [107, 35] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200337__367.json | 86.6667 | missing | missing | missing | |
| 5042 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200337__654 | 4 | 0.0001105 | 0.737586 | 2 | [107, 38] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200337__654.json | 86.6667 | missing | missing | missing | |
| 5043 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200338__237 | 4 | 9.1e-5 | 0.663671 | 2 | [107, 25] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200338__237.json | 86.6667 | missing | missing | missing | |
| 5044 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_200332__291 | 0 | 0.0001915 | 0.877948 | 0 | [170, 71] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200332__291.json | 0.0 | missing | missing | missing | |
| 5045 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_200332__585 | 0 | 0.000166 | 0.565806 | 0 | [170, 54] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200332__585.json | 0.0 | missing | missing | missing | |
| 5046 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200333__217 | 4 | 0.000187 | 0.991937 | 2 | [170, 68] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200333__217.json | 86.6667 | missing | missing | missing | |
| 5047 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200334__199 | 4 | 0.0001435 | 0.699345 | 2 | [170, 39] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200334__199.json | 86.6667 | missing | missing | missing | |
| 5048 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_200335__828 | 0 | 0.000169 | 0.786283 | 0 | [170, 56] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200335__828.json | 0.0 | missing | missing | missing | |
| 5049 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200349__889 | 4 | 0.0003345 | 1.15164 | 3 | [330, 113] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200349__889.json | 95.0 | missing | missing | missing | |
| 5050 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200350__110 | 5 | 0.0002925 | 0.87877 | 3 | [330, 85] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200350__110.json | 100.0 | missing | missing | missing | |
| 5051 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200351__619 | 0 | 0.0002565 | 0.818587 | 0 | [330, 61] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200351__619.json | 0.0 | missing | missing | missing | |
| 5052 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200352__632 | 5 | 0.000396 | 1.39511 | 3 | [330, 154] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200352__632.json | 100.0 | missing | missing | missing | |
| 5053 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200354__196 | 0 | 0.000369 | 1.29473 | 0 | [330, 136] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200354__196.json | 0.0 | missing | missing | missing | |
| 5054 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200344__245 | 0 | 0.000319 | 1.08148 | 0 | [329, 103] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200344__245.json | 50.0 | missing | missing | missing | |
| 5055 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200345__915 | 0 | 0.000358 | 1.03144 | 0 | [329, 129] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200345__915.json | 50.0 | missing | missing | missing | |
| 5056 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200346__373 | 5 | 0.000223 | 0.500827 | 3 | [329, 39] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200346__373.json | 100.0 | missing | missing | missing | |
| 5057 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200347__724 | 0 | 0.0004015 | 1.26816 | 0 | [329, 158] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200347__724.json | 50.0 | missing | missing | missing | |
| 5058 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200348__934 | 0 | 0.0002215 | 0.693928 | 0 | [329, 38] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200348__934.json | 50.0 | missing | missing | missing | |
| 5059 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231213_202100__674 | 0 | 0.000377 | 4.20056 | 0 | [69, 154] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_202100__674.json | 0.0 | missing | missing | missing | |
| 5060 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_190805__212 | 0 | 0.000407 | 3.0346 | 0 | [69, 169] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_190805__212.json | 0.0 | missing | missing | missing | |
| 5061 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_190807__577 | 0 | 0.000285 | 1.63614 | 0 | [69, 108] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_190807__577.json | 0.0 | missing | missing | missing | |
| 5062 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 5 | 20231215_193156__161 | 0 | 0.0 | 2.73865 | 0 | [69, 99] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_193156__161.json | 0.0 | 0.9 | missing | 0.1 |
| 5063 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231213_202055__940 | 4 | 0.00039 | 3.39241 | 2 | [72, 159] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_202055__940.json | 86.6667 | missing | missing | missing | |
| 5064 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_190801__860 | 5 | 0.000252 | 1.64026 | 3 | [72, 90] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_190801__860.json | 100.0 | missing | missing | missing | |
| 5065 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_190802__425 | 4 | 0.000254 | 1.54545 | 2 | [72, 91] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_190802__425.json | 86.6667 | missing | missing | missing | |
| 5066 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_194321__849 | 3 | 0.000318 | 2.72622 | 2 | [72, 123] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_194321__849.json | 81.6667 | missing | missing | missing | |
| 5067 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_194324__678 | 4 | 0.000382 | 2.76232 | 2 | [72, 155] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_194324__678.json | 86.6667 | missing | missing | missing | |
| 5068 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_193154__434 | 4 | 0.0 | 1.72957 | 2 | [72, 74] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_193154__434.json | 86.6667 | 0.9 | missing | 0.1 |
| 5069 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202052__820 | 4 | 0.000179 | 2.05826 | 2 | [107, 36] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_202052__820.json | 86.6667 | missing | missing | missing | |
| 5070 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_190758__612 | 4 | 0.000179 | 0.940001 | 2 | [107, 36] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_190758__612.json | 86.6667 | missing | missing | missing | |
| 5071 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_190759__996 | 4 | 0.000177 | 0.698347 | 2 | [107, 35] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_190759__996.json | 86.6667 | missing | missing | missing | |
| 5072 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194317__110 | 4 | 0.000175 | 1.28367 | 2 | [107, 34] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_194317__110.json | 86.6667 | missing | missing | missing | |
| 5073 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194318__811 | 5 | 0.000189 | 1.06031 | 3 | [107, 41] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_194318__811.json | 100.0 | missing | missing | missing | |
| 5074 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193152__223 | 4 | 0.0 | 2.16681 | 2 | [107, 31] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_193152__223.json | 86.6667 | 0.9 | missing | 0.1 |
| 5075 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202049__196 | 5 | 0.000298 | 1.56064 | 3 | [170, 64] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202049__196.json | 100.0 | missing | missing | missing | |
| 5076 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_190756__667 | 5 | 0.000308 | 1.07883 | 3 | [170, 69] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190756__667.json | 100.0 | missing | missing | missing | |
| 5077 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_190757__917 | 5 | 0.000284 | 1.20553 | 3 | [170, 57] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190757__917.json | 100.0 | missing | missing | missing | |
| 5078 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_194314__246 | 4 | 0.00027 | 1.07969 | 2 | [170, 50] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194314__246.json | 86.6667 | missing | missing | missing | |
| 5079 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_194316__577 | 0 | 0.000274 | 2.00184 | 0 | [170, 52] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194316__577.json | 25.0 | missing | missing | missing | |
| 5080 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_193150__229 | 4 | 0.0 | 2.50329 | 2 | [170, 50] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193150__229.json | 86.6667 | 0.9 | missing | 0.1 | |
| 5081 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202105__836 | 4 | 0.00066 | 3.3229 | 3 | [330, 165] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202105__836.json | 95.0 | missing | missing | missing | |
| 5082 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_190811__989 | 0 | 0.000528 | 1.81898 | 0 | [330, 99] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190811__989.json | 0.0 | missing | missing | missing | |
| 5083 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_190813__601 | 0 | 0.0005 | 1.42748 | 0 | [330, 85] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_190813__601.json | 0.0 | missing | missing | missing | |
| 5084 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_194328__888 | 0 | 0.000466 | 1.41699 | 0 | [330, 68] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194328__888.json | 0.0 | missing | missing | missing | |
| 5085 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_194330__345 | 0 | 0.000628 | 2.47309 | 0 | [330, 149] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194330__345.json | 0.0 | missing | missing | missing | |
| 5086 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_193159__397 | 0 | 0.0 | 1.31352 | 0 | [330, 58] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193159__397.json | 0.0 | 0.9 | missing | 0.1 | |
| 5087 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202102__207 | 5 | 0.000495 | 2.27026 | 3 | [329, 83] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_202102__207.json | 100.0 | missing | missing | missing | |
| 5088 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_190808__183 | 0 | 0.000415 | 1.11473 | 0 | [329, 43] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_190808__183.json | 50.0 | missing | missing | missing | |
| 5089 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_190809__829 | 5 | 0.000421 | 1.00046 | 3 | [329, 46] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_190809__829.json | 100.0 | missing | missing | missing | |
| 5090 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_194325__402 | 0 | 0.000421 | 1.07837 | 0 | [329, 46] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_194325__402.json | 50.0 | missing | missing | missing | |
| 5091 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_194326__391 | 0 | 0.000419 | 1.21289 | 0 | [329, 45] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_194326__391.json | 50.0 | missing | missing | missing | |
| 5092 | Apple-MacBook-Pro-M1 | clean_column | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_193158__847 | 5 | 0.0 | 1.33947 | 3 | [329, 55] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_193158__847.json | 100.0 | 0.9 | missing | 0.1 | |
| 5093 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_084214__280 | 4 | 0.00834 | 17.1531 | 2 | [72, 254] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_084214__280.json | 86.6667 | missing | missing | missing | |
| 5094 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_084245__357 | 4 | 0.00915 | 30.5818 | 2 | [72, 281] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_084245__357.json | 86.6667 | missing | missing | missing | |
| 5095 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_084304__437 | 4 | 0.00885 | 19.6041 | 2 | [72, 271] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_084304__437.json | 86.6667 | missing | missing | missing | |
| 5096 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_084327__926 | 4 | 0.01134 | 22.5953 | 2 | [72, 354] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_084327__926.json | 86.6667 | missing | missing | missing | |
| 5097 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_084350__954 | 4 | 0.01119 | 23.2244 | 2 | [72, 349] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_084350__954.json | 86.6667 | missing | missing | missing | |
| 5098 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_083932__760 | 4 | 0.002 | 2.50134 | 2 | [107, 31] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_083932__760.json | 86.6667 | missing | missing | missing | |
| 5099 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_083936__308 | 4 | 0.0026 | 3.83408 | 2 | [107, 51] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_083936__308.json | 86.6667 | missing | missing | missing | |
| 5100 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_083939__578 | 4 | 0.00194 | 2.62264 | 2 | [107, 29] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_083939__578.json | 86.6667 | missing | missing | missing | |
| 5101 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_083941__759 | 4 | 0.002 | 2.47022 | 2 | [107, 31] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_083941__759.json | 86.6667 | missing | missing | missing | |
| 5102 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_083944__155 | 4 | 0.00212 | 2.72074 | 2 | [107, 35] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_083944__155.json | 86.6667 | missing | missing | missing | |
| 5103 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_083806__857 | 4 | 0.00965 | 24.6824 | 2 | [170, 265] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_083806__857.json | 86.6667 | missing | missing | missing | |
| 5104 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_083818__845 | 5 | 0.00707 | 12.0019 | 3 | [170, 179] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_083818__845.json | 100.0 | missing | missing | missing | |
| 5105 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_083839__463 | 4 | 0.00773 | 20.7867 | 2 | [170, 201] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_083839__463.json | 86.6667 | missing | missing | missing | |
| 5106 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_083849__441 | 4 | 0.00596 | 10.7265 | 2 | [170, 142] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_083849__441.json | 86.6667 | missing | missing | missing | |
| 5107 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_083915__335 | 4 | 0.01082 | 25.6201 | 2 | [170, 304] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_083915__335.json | 86.6667 | missing | missing | missing | |
| 5108 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_085035__702 | 5 | 0.00849 | 16.4297 | 3 | [330, 173] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_085035__702.json | 100.0 | missing | missing | missing | |
| 5109 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_085056__351 | 5 | 0.01299 | 21.1276 | 3 | [330, 323] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_085056__351.json | 100.0 | missing | missing | missing | |
| 5110 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_085116__176 | 5 | 0.01296 | 19.8288 | 3 | [330, 322] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_085116__176.json | 100.0 | missing | missing | missing | |
| 5111 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_085138__498 | 0 | 0.01266 | 21.2532 | 0 | [330, 312] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_085138__498.json | 0.0 | missing | missing | missing | |
| 5112 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_085159__994 | 5 | 0.00963 | 21.1859 | 3 | [330, 211] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_085159__994.json | 100.0 | missing | missing | missing | |
| 5113 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_084654__481 | 5 | 0.01349 | 47.4372 | 3 | [329, 340] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_084654__481.json | 100.0 | missing | missing | missing | |
| 5114 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_084726__106 | 5 | 0.01547 | 32.1707 | 3 | [329, 406] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_084726__106.json | 100.0 | missing | missing | missing | |
| 5115 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_084743__358 | 0 | 0.01052 | 17.2913 | 0 | [329, 241] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_084743__358.json | 0.0 | missing | missing | missing | |
| 5116 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_084813__584 | 5 | 0.01148 | 29.1498 | 3 | [329, 273] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_084813__584.json | 100.0 | missing | missing | missing | |
| 5117 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_084843__416 | 5 | 0.01622 | 30.5981 | 3 | [329, 431] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_084843__416.json | 100.0 | missing | missing | missing | |
| 5118 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231213_202208__571 | 0 | 0.00552 | 17.5789 | 0 | [69, 161] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_202208__571.json | 0.0 | missing | missing | missing | |
| 5119 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_190949__894 | 0 | 0.0057 | 12.1448 | 0 | [69, 167] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_190949__894.json | 0.0 | missing | missing | missing | |
| 5120 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_190958__549 | 0 | 0.00714 | 9.72434 | 0 | [69, 215] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_190958__549.json | 0.0 | missing | missing | missing | |
| 5121 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 5 | 20231215_193251__105 | 0 | 0.0 | 12.8786 | 0 | [69, 163] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_193251__105.json | 0.0 | 0.1 | missing | 0.9 | |
| 5122 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_202150__806 | 4 | 0.0084 | 24.8915 | 2 | [72, 256] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_202150__806.json | 86.6667 | missing | missing | missing | |
| 5123 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_190908__764 | 3 | 0.00945 | 13.6032 | 2 | [72, 291] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_190908__764.json | 81.6667 | missing | missing | missing | |
| 5124 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_190936__341 | 4 | 0.00732 | 28.2794 | 2 | [72, 220] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_190936__341.json | 86.6667 | missing | missing | missing | |
| 5125 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_194439__472 | 4 | 0.00681 | 14.1468 | 2 | [72, 203] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_194439__472.json | 86.6667 | missing | missing | missing | |
| 5126 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_194450__160 | 5 | 0.00678 | 11.5072 | 3 | [72, 202] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_194450__160.json | 100.0 | missing | missing | missing | |
| 5127 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_193238__675 | 4 | 0.0 | 18.9136 | 2 | [72, 245] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_193238__675.json | 86.6667 | 0.1 | missing | 0.9 | |
| 5128 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202125__847 | 3 | 0.00395 | 9.42867 | 2 | [107, 96] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_202125__847.json | 81.6667 | missing | missing | missing | |
| 5129 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_190850__692 | 3 | 0.00206 | 2.59496 | 2 | [107, 33] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_190850__692.json | 81.6667 | missing | missing | missing | |
| 5130 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_190854__259 | 4 | 0.00356 | 4.20234 | 2 | [107, 83] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_190854__259.json | 86.6667 | missing | missing | missing | |
| 5131 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194409__761 | 0 | 0.002 | 3.47048 | 0 | [107, 31] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_194409__761.json | 50.0 | missing | missing | missing | |
| 5132 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194425__216 | 4 | 0.00368 | 15.2047 | 2 | [107, 87] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_194425__216.json | 86.6667 | missing | missing | missing | |
| 5133 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193219__723 | 3 | 0.0 | 5.09514 | 2 | [107, 67] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_193219__723.json | 81.6667 | 0.1 | missing | 0.9 | |
| 5134 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202116__623 | 4 | 0.00491 | 10.2202 | 2 | [170, 107] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202116__623.json | 86.6667 | missing | missing | missing | |
| 5135 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_190828__531 | 4 | 0.00842 | 14.9753 | 2 | [170, 224] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190828__531.json | 86.6667 | missing | missing | missing | |
| 5136 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_190847__743 | 4 | 0.00674 | 19.7897 | 2 | [170, 168] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_190847__743.json | 86.6667 | missing | missing | missing | |
| 5137 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_194348__359 | 4 | 0.00824 | 17.4678 | 2 | [170, 218] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194348__359.json | 86.6667 | missing | missing | missing | |
| 5138 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_194406__418 | 4 | 0.00869 | 17.8975 | 2 | [170, 233] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194406__418.json | 86.6667 | missing | missing | missing | |
| 5139 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_193214__499 | 4 | 0.0 | 14.6541 | 2 | [170, 204] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193214__499.json | 86.6667 | 0.1 | missing | 0.9 | |
| 5140 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202302__251 | 5 | 0.01101 | 20.7555 | 3 | [330, 257] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202302__251.json | 100.0 | missing | missing | missing | |
| 5141 | Apple-MacBook-Pro-M1 | clean_column | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_191030__594 | 5 | 0.0081 | 11.0677 | 3 | [330, 160] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191030__594.json | 100.0 | missing | missing | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |
| 5218 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_191250__747 | 5 | 0.0016641 | 8.02182 | 3 | [77, 180] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__InJulia__1SHOT__20231225_191250__747.json | 100.0 | missing | missing | missing | |
| 5219 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_191300__623 | 0 | 0.00253782 | 9.70754 | 3 | [77, 288] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__InJulia__1SHOT__20231225_191300__623.json | 75.0 | missing | missing | missing | |
| 5220 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_194828__427 | 5 | 0.00194725 | 4.7905 | 3 | [77, 215] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__InJulia__1SHOT__20231227_194828__427.json | 100.0 | missing | missing | missing | |
| 5221 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_194840__112 | 5 | 0.00201197 | 11.6892 | 3 | [77, 223] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__InJulia__1SHOT__20231227_194840__112.json | 100.0 | missing | missing | missing | |
| 5222 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium--optim | InJulia | 1SHOT | true | true | 5 | 20231215_193423__967 | 5 | 0.0 | 4.07302 | 3 | [77, 183] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__InJulia__1SHOT__20231215_193423__967.json | 100.0 | 0.9 | missing | 0.3 | |
| 5223 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202502__921 | 5 | 0.00203637 | 32.136 | 3 | [116, 213] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_202502__921.json | 100.0 | missing | missing | missing | |
| 5224 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191238__245 | 5 | 0.00202019 | 4.76322 | 3 | [116, 211] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_191238__245.json | 100.0 | missing | missing | missing | |
| 5225 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191242__955 | 5 | 0.00142153 | 3.22011 | 3 | [116, 137] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_191242__955.json | 100.0 | missing | missing | missing | |
| 5226 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194820__129 | 5 | 0.0012031 | 2.55765 | 3 | [116, 110] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_194820__129.json | 100.0 | missing | missing | missing | |
| 5227 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194823__694 | 5 | 0.00138108 | 3.12343 | 3 | [116, 132] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_194823__694.json | 100.0 | missing | missing | missing | |
| 5228 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193419__106 | 5 | 0.0 | 4.5598 | 3 | [116, 125] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_193419__106.json | 100.0 | 0.9 | missing | 0.3 | |
| 5229 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202430__451 | 0 | 0.00330136 | 46.4589 | 0 | [192, 344] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202430__451.json | 50.0 | missing | missing | missing | |
| 5230 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191227__170 | 5 | 0.00242764 | 5.74822 | 3 | [192, 236] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191227__170.json | 100.0 | missing | missing | missing | |
| 5231 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191234__246 | 5 | 0.00279978 | 6.34343 | 3 | [192, 282] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191234__246.json | 100.0 | missing | missing | missing | |
| 5232 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_194803__203 | 5 | 0.00258944 | 20.1651 | 3 | [192, 256] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194803__203.json | 100.0 | missing | missing | missing | |
| 5233 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_194817__415 | 5 | 0.00331754 | 14.3325 | 3 | [192, 346] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194817__415.json | 100.0 | missing | missing | missing | |
| 5234 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_193415__619 | 5 | 0.0 | 6.54296 | 3 | [192, 290] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193415__619.json | 100.0 | 0.9 | missing | 0.3 | |
| 5235 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_202712__699 | 0 | 0.00188893 | 17.5437 | 0 | [379, 107] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202712__699.json | 0.0 | missing | missing | missing | |
| 5236 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_191348__846 | 5 | 0.00414604 | 8.84242 | 3 | [379, 386] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191348__846.json | 100.0 | missing | missing | missing | |
| 5237 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_191408__192 | 0 | 0.00454245 | 20.5812 | 0 | [379, 435] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191408__192.json | 0.0 | missing | missing | missing | |
| 5238 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_194857__342 | 0 | 0.00249568 | 4.27166 | 0 | [379, 182] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194857__342.json | 0.0 | missing | missing | missing | |
| 5239 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_194907__143 | 5 | 0.00448582 | 9.76725 | 3 | [379, 428] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194907__143.json | 100.0 | missing | missing | missing | |
| 5240 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_193440__485 | 5 | 0.0 | 6.51334 | 3 | [379, 285] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193440__485.json | 100.0 | 0.9 | missing | 0.3 | |
| 5241 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202655__838 | 0 | 0.00332085 | 42.1628 | 0 | [376, 285] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_202655__838.json | 50.0 | missing | missing | missing | |
| 5242 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_191333__229 | 5 | 0.00395996 | 8.40053 | 3 | [376, 364] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_191333__229.json | 100.0 | missing | missing | missing | |
| 5243 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_191339__504 | 0 | 0.0031186 | 6.04324 | 0 | [376, 260] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_191339__504.json | 50.0 | missing | missing | missing | |
| 5244 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_194847__845 | 5 | 0.00340175 | 6.79817 | 3 | [376, 295] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_194847__845.json | 100.0 | missing | missing | missing | |
| 5245 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_194853__153 | 5 | 0.00307815 | 5.87627 | 3 | [376, 255] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_194853__153.json | 100.0 | missing | missing | missing | |
| 5246 | Apple-MacBook-Pro-M1 | clean_column | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_193434__705 | 5 | 0.0 | 6.92636 | 3 | [376, 306] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_193434__705.json | 100.0 | 0.9 | missing | 0.3 | |
| 5247 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231213_202337__495 | 0 | 0.000299431 | 1.90987 | 0 | [73, 130] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__AsIs__1SHOT__20231213_202337__495.json | 0.0 | missing | missing | missing | |
| 5248 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_191158__415 | 0 | 0.000363451 | 2.37026 | 0 | [73, 163] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__AsIs__1SHOT__20231225_191158__415.json | 0.0 | missing | missing | missing | |
| 5249 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_191200__880 | 0 | 0.000293611 | 1.8724 | 0 | [73, 127] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__AsIs__1SHOT__20231225_191200__880.json | 0.0 | missing | missing | missing | |
| 5250 | Apple-MacBook-Pro-M1 | clean_column | mistral-small--optim | AsIs | 1SHOT | false | false | 5 | 20231215_193357__660 | 0 | 0.0 | 3.48369 | 0 | [73, 257] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__AsIs__1SHOT__20231215_193357__660.json | 0.0 | 0.9 | missing | 0.3 | |
| 5251 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231213_202335__273 | 5 | 0.000514772 | 3.3472 | 3 | [76, 240] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__InJulia__1SHOT__20231213_202335__273.json | 100.0 | missing | missing | missing | |
| 5252 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_191153__964 | 5 | 0.000390612 | 2.53073 | 3 | [76, 176] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__InJulia__1SHOT__20231225_191153__964.json | 100.0 | missing | missing | missing | |
| 5253 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_191156__127 | 5 | 0.000365392 | 2.30371 | 3 | [76, 163] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__InJulia__1SHOT__20231225_191156__127.json | 100.0 | missing | missing | missing | |
| 5254 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_194716__900 | 5 | 0.000510892 | 3.30783 | 3 | [76, 238] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__InJulia__1SHOT__20231227_194716__900.json | 100.0 | missing | missing | missing | |
| 5255 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_194719__406 | 5 | 0.000377032 | 2.48921 | 3 | [76, 169] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__InJulia__1SHOT__20231227_194719__406.json | 100.0 | missing | missing | missing | |
| 5256 | Apple-MacBook-Pro-M1 | clean_column | mistral-small--optim | InJulia | 1SHOT | true | true | 5 | 20231215_193354__409 | 5 | 0.0 | 2.43504 | 3 | [76, 183] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__InJulia__1SHOT__20231215_193354__409.json | 100.0 | 0.9 | missing | 0.3 | |
| 5257 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_202332__346 | 0 | 0.000446239 | 2.71636 | 0 | [117, 191] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_202332__346.json | 25.0 | missing | missing | missing | |
| 5258 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191150__587 | 5 | 0.000384159 | 2.32976 | 3 | [117, 159] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_191150__587.json | 100.0 | missing | missing | missing | |
| 5259 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191151__816 | 4 | 0.000153299 | 0.735137 | 3 | [117, 40] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_191151__816.json | 95.0 | missing | missing | missing | |
| 5260 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194711__517 | 4 | 0.000153299 | 7.04269 | 3 | [117, 40] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_194711__517.json | 95.0 | missing | missing | missing | |
| 5261 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194713__513 | 0 | 0.000310439 | 1.89127 | 0 | [117, 121] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_194713__513.json | 50.0 | missing | missing | missing | |
| 5262 | Apple-MacBook-Pro-M1 | clean_column | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193351__617 | 4 | 0.0 | 0.723179 | 3 | [117, 40] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_193351__617.json | 95.0 | 0.9 | missing | 0.3 | |
| 5263 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202329__274 | 5 | 0.000691351 | 4.01417 | 3 | [193, 292] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202329__274.json | 100.0 | missing | missing | missing | |
| 5264 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191142__482 | 0 | 0.000734031 | 4.32615 | 0 | [193, 314] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191142__482.json | 50.0 | missing | missing | missing | |
| 5265 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191147__854 | 5 | 0.000763131 | 4.61878 | 3 | [193, 329] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191147__854.json | 100.0 | missing | missing | missing | |
| 5266 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_194659__923 | 0 | 0.000670011 | 3.98941 | 0 | [193, 281] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194659__923.json | 50.0 | missing | missing | missing | |
| 5267 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_194704__317 | 5 | 0.000815511 | 4.86236 | 3 | [193, 356] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194704__317.json | 100.0 | missing | missing | missing | |
| 5268 | Apple-MacBook-Pro-M1 | clean_column | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_193350__462 | 5 | 0.0 | 4.20957 | 3 | [193, 310] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193350__462.json | 100.0 | 0.9 | missing | 0.3 | |
| 5269 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202343__681 | 5 | 0.000628041 | 2.80999 | 3 | [383, 196] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202343__681.json | 100.0 | missing | missing | missing | |
| 5270 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_191217__510 | 5 | 0.000792941 | 4.00368 | 3 | [383, 281] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191217__510.json | 100.0 | missing | missing | missing | |
| 5271 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_191221__296 | 5 | 0.000822041 | 4.06869 | 3 | [383, 296] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191221__296.json | 100.0 | missing | missing | missing | |
| 5272 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_194740__970 | 5 | 0.000676541 | 3.14328 | 3 | [383, 221] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194740__970.json | 100.0 | missing | missing | missing | |
| 5273 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_194743__592 | 0 | 0.000682361 | 3.21523 | 0 | [383, 224] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194743__592.json | 50.0 | missing | missing | missing | |
| 5274 | Apple-MacBook-Pro-M1 | clean_column | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_193408__323 | 5 | 0.0 | 6.94791 | 3 | [383, 518] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193408__323.json | 100.0 | 0.9 | missing | 0.3 | |
| 5275 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202340__482 | 5 | 0.000717927 | 3.36801 | 3 | [381, 243] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_202340__482.json | 100.0 | missing | missing | missing | |
| 5276 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_191206__151 | 0 | 0.00112533 | 6.32033 | 0 | [381, 453] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_191206__151.json | 50.0 | missing | missing | missing | |
| 5277 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_191213__665 | 5 | 0.00119323 | 6.66779 | 3 | [381, 488] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_191213__665.json | 100.0 | missing | missing | missing | |
| 5278 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_194729__711 | 5 | 0.000780007 | 10.3927 | 3 | [381, 275] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_194729__711.json | 100.0 | missing | missing | missing | |
| 5279 | Apple-MacBook-Pro-M1 | clean_column | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_194736__580 | 0 | 0.00125919 | 7.13474 | 0 | [381, 522] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_194736__580.json | 50.0 | missing | missing | missing | |
| 5280 | Apple-MacBook-Pro-M1 | clean_column | mistral-small--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_193401__226 | 5 | 0.0 | 3.85864 | 3 | [381, 287] | 0.10.0-DEV | 3 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_193401__226.json | 100.0 | 0.9 | missing | 0.3 | |
| 5281 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_202316__569 | 0 | 0.000113051 | 3.25076 | 0 | [73, 227] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__AsIs__1SHOT__20231213_202316__569.json | 0.0 | missing | missing | missing | |
| 5282 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_191120__713 | 0 | 0.000124376 | 3.05307 | 0 | [73, 252] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__AsIs__1SHOT__20231225_191120__713.json | 0.0 | missing | missing | missing | |
| 5283 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_191122__759 | 0 | 0.000129812 | 2.36363 | 0 | [73, 264] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__AsIs__1SHOT__20231225_191122__759.json | 0.0 | missing | missing | missing | |
| 5284 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny--optim | AsIs | 1SHOT | false | false | 5 | 20231215_193341__361 | 0 | 0.0 | 2.44149 | 0 | [73, 293] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__AsIs__1SHOT__20231215_193341__361.json | 0.0 | 0.9 | missing | 0.3 | |
| 5285 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231213_202313__123 | 5 | 8.9462e-5 | 3.2843 | 3 | [76, 174] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__InJulia__1SHOT__20231213_202313__123.json | 100.0 | missing | missing | missing | |
| 5286 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_191113__304 | 5 | 0.000111659 | 2.06061 | 3 | [76, 223] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__InJulia__1SHOT__20231225_191113__304.json | 100.0 | missing | missing | missing | |
| 5287 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_191117__771 | 0 | 0.000138386 | 3.43786 | 0 | [76, 282] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__InJulia__1SHOT__20231225_191117__771.json | 50.0 | missing | missing | missing | |
| 5288 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_194642__362 | 5 | 9.5351e-5 | 1.69466 | 3 | [76, 187] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__InJulia__1SHOT__20231227_194642__362.json | 100.0 | missing | missing | missing | |
| 5289 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_194645__419 | 5 | 0.000138386 | 2.5775 | 3 | [76, 282] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__InJulia__1SHOT__20231227_194645__419.json | 100.0 | missing | missing | missing | |
| 5290 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny--optim | InJulia | 1SHOT | true | true | 5 | 20231215_193338__177 | 5 | 0.0 | 1.32162 | 3 | [76, 155] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__InJulia__1SHOT__20231215_193338__177.json | 100.0 | 0.9 | missing | 0.3 | |
| 5291 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202310__651 | 5 | 3.9483e-5 | 1.03725 | 3 | [117, 51] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_202310__651.json | 100.0 | missing | missing | missing | |
| 5292 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191109__269 | 4 | 3.5406e-5 | 0.534935 | 3 | [117, 42] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_191109__269.json | 95.0 | missing | missing | missing | |
| 5293 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191111__576 | 0 | 0.000117399 | 2.16359 | 3 | [117, 223] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_191111__576.json | 75.0 | missing | missing | missing | |
| 5294 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194639__380 | 5 | 7.8894e-5 | 1.42476 | 3 | [117, 138] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_194639__380.json | 100.0 | missing | missing | missing | |
| 5295 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194640__454 | 5 | 5.5338e-5 | 0.914826 | 3 | [117, 86] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_194640__454.json | 100.0 | missing | missing | missing | |
| 5296 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193337__212 | 5 | 0.0 | 0.542947 | 3 | [117, 51] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_193337__212.json | 100.0 | 0.9 | missing | 0.3 | |
| 5297 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202309__889 | 0 | 0.00016292 | 6.58491 | 0 | [193, 300] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202309__889.json | 50.0 | missing | missing | missing | |
| 5298 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191106__414 | 0 | 0.00014933 | 8.96411 | 0 | [193, 270] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191106__414.json | 50.0 | missing | missing | missing | |
| 5299 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191108__636 | 5 | 0.000129398 | 2.10717 | 3 | [193, 226] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191108__636.json | 100.0 | missing | missing | missing | |
| 5300 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_194635__279 | 0 | 0.000171074 | 9.39797 | 0 | [193, 318] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194635__279.json | 25.0 | missing | missing | missing | |
| 5301 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_194638__197 | 5 | 0.000170168 | 2.80515 | 3 | [193, 316] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194638__197.json | 100.0 | missing | missing | missing | |
| 5302 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_193336__969 | 0 | 0.0 | 4.27285 | 0 | [193, 258] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193336__969.json | 50.0 | 0.9 | missing | 0.3 | |
| 5303 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202325__765 | 5 | 0.000199939 | 4.14732 | 3 | [383, 323] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202325__765.json | 100.0 | missing | missing | missing | |
| 5304 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_191134__468 | 5 | 0.000190879 | 3.43317 | 3 | [383, 303] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191134__468.json | 100.0 | missing | missing | missing | |
| 5305 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_191138__163 | 0 | 0.000202657 | 4.14812 | 0 | [383, 329] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191138__163.json | 0.0 | missing | missing | missing | |
| 5306 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_194651__294 | 0 | 0.000129724 | 1.61192 | 0 | [383, 168] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194651__294.json | 50.0 | missing | missing | missing | |
| 5307 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_194655__139 | 0 | 0.000217153 | 3.32764 | 0 | [383, 361] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194655__139.json | 50.0 | missing | missing | missing | |
| 5308 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_193346__651 | 0 | 0.0 | 2.74515 | 0 | [383, 316] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193346__651.json | 50.0 | 0.9 | missing | 0.3 | |
| 5309 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202321__258 | 5 | 0.000181992 | 4.20314 | 3 | [381, 284] | 0.10.0-DEV | 3 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_202321__258.json | 100.0 | missing | missing | missing | |
| 5310 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_191126__743 | 5 | 0.000196488 | 3.81532 | 3 | [381, 316] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_191126__743.json | 100.0 | missing | missing | missing | |
| 5311 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_191130__277 | 0 | 0.000209172 | 4.25252 | 3 | [381, 344] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_191130__277.json | 75.0 | missing | missing | missing | |
| 5312 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_194648__840 | 5 | 0.000205548 | 3.10484 | 3 | [381, 336] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_194648__840.json | 100.0 | missing | missing | missing | |
| 5313 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_194650__995 | 5 | 0.000140316 | 1.79348 | 3 | [381, 192] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_194650__995.json | 100.0 | missing | missing | missing | |
| 5314 | Apple-MacBook-Pro-M1 | clean_column | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_193343__510 | 5 | 0.0 | 2.38747 | 3 | [381, 276] | 0.10.0-DEV | 3 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_193343__510.json | 100.0 | 0.9 | missing | 0.3 | |
| 5315 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_230702__134 | 0 | 0.0 | 6.49401 | 0 | [62, 196] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_230702__134.json | 0.0 | missing | missing | missing | |
| 5316 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_230707__952 | 0 | 0.0 | 5.04368 | 0 | [1, 163] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_230707__952.json | 0.0 | missing | missing | missing | |
| 5317 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_230712__616 | 0 | 0.0 | 5.21166 | 0 | [1, 168] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_230712__616.json | 0.0 | missing | missing | missing | |
| 5318 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_230638__597 | 0 | 0.0 | 13.2083 | 0 | [79, 393] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_230638__597.json | 0.0 | missing | missing | missing | |
| 5319 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_230644__531 | 0 | 0.0 | 5.69774 | 0 | [1, 183] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_230644__531.json | 50.0 | missing | missing | missing | |
| 5320 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_230656__809 | 0 | 0.0 | 11.964 | 0 | [1, 373] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_230656__809.json | 0.0 | missing | missing | missing | |
| 5321 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_233723__567 | 0 | 0.0 | 2.74645 | 0 | [75, 59] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_233723__567.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 5322 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_230615__680 | 0 | 0.0 | 9.26727 | 0 | [108, 266] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230615__680.json | 50.0 | missing | missing | missing | |
| 5323 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_230620__238 | 0 | 0.0 | 4.95988 | 0 | [1, 158] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230620__238.json | 0.0 | missing | missing | missing | |
| 5324 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_230625__199 | 0 | 0.0 | 4.96674 | 0 | [1, 158] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230625__199.json | 50.0 | missing | missing | missing | |
| 5325 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_233720__557 | 0 | 0.0 | 8.8399 | 0 | [116, 213] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_233720__557.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 5326 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_230540__998 | 0 | 0.0 | 14.5892 | 0 | [184, 395] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230540__998.json | 25.0 | missing | missing | missing | |
| 5327 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_230556__172 | 0 | 0.0 | 15.6004 | 0 | [1, 460] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230556__172.json | 50.0 | missing | missing | missing | |
| 5328 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_230605__994 | 0 | 0.0 | 9.68826 | 0 | [1, 294] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230605__994.json | 0.0 | missing | missing | missing | |
| 5329 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_233711__874 | 0 | 0.0 | 9.21638 | 0 | [192, 75] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233711__874.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 5330 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_230818__847 | 0 | 0.0 | 20.0098 | 0 | [11, 542] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230818__847.json | 50.0 | missing | missing | missing | |
| 5331 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_230834__676 | 0 | 0.0 | 15.8856 | 0 | [1, 442] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230834__676.json | 50.0 | missing | missing | missing | |
| 5332 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_230851__713 | 0 | 0.0 | 16.5733 | 0 | [1, 460] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230851__713.json | 50.0 | missing | missing | missing | |
| 5333 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233756__166 | 0 | 0.0 | 18.8073 | 0 | [383, 421] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233756__166.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 5334 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_230730__644 | 0 | 0.0 | 17.5038 | 0 | [379, 395] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230730__644.json | 50.0 | missing | missing | missing | |
| 5335 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_230747__478 | 0 | 0.0 | 17.1873 | 0 | [1, 476] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230747__478.json | 50.0 | missing | missing | missing | |
| 5336 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_230758__568 | 0 | 0.0 | 11.0129 | 0 | [1, 313] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230758__568.json | 50.0 | missing | missing | missing | |
| 5337 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_233737__246 | 0 | 0.0 | 14.5761 | 0 | [381, 317] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_233737__246.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 5338 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_231840__227 | 5 | 0.0 | 7.03798 | 3 | [74, 219] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231840__227.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 5339 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_231847__484 | 0 | 0.0 | 7.16018 | 0 | [74, 223] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231847__484.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 5340 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_231854__195 | 0 | 0.0 | 6.88232 | 3 | [74, 214] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231854__195.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 5341 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_231903__887 | 0 | 0.0 | 8.85004 | 0 | [74, 278] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231903__887.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 5342 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_231914__776 | 0 | 0.0 | 11.4097 | 0 | [74, 361] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231914__776.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 5343 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231825__828 | 5 | 0.0 | 2.13424 | 3 | [115, 51] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231825__828.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 5344 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231827__426 | 5 | 0.0 | 1.93383 | 3 | [115, 45] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231827__426.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 5345 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231829__509 | 5 | 0.0 | 1.95376 | 3 | [115, 45] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231829__509.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 5346 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231831__451 | 5 | 0.0 | 2.02685 | 3 | [115, 48] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231831__451.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 5347 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231833__724 | 0 | 0.0 | 1.90211 | 0 | [115, 44] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231833__724.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 5348 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_231750__982 | 0 | 0.0 | 12.2811 | 0 | [191, 344] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231750__982.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 5349 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_231759__170 | 0 | 0.0 | 8.89832 | 3 | [191, 262] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231759__170.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 5350 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_231806__355 | 5 | 0.0 | 7.66764 | 3 | [191, 222] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231806__355.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 5351 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_231810__852 | 5 | 0.0 | 3.72964 | 3 | [191, 94] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231810__852.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 5352 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_231823__218 | 0 | 0.0 | 12.3573 | 3 | [191, 372] | 0.10.0-DEV | 3 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231823__218.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 5353 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232017__175 | 0 | 0.0 | 12.7513 | 0 | [382, 348] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232017__175.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5354 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232030__877 | 5 | 0.0 | 12.9383 | 3 | [382, 353] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232030__877.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5355 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232041__473 | 4 | 0.0 | 10.1345 | 3 | [382, 267] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232041__473.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5356 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232051__464 | 5 | 0.0 | 10.0372 | 3 | [382, 264] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232051__464.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5357 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232102__110 | 0 | 0.0 | 10.9077 | 0 | [382, 291] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232102__110.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5358 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_231925__779 | 0 | 0.0 | 10.2035 | 0 | [380, 269] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231925__779.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5359 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_231935__600 | 5 | 0.0 | 10.1884 | 3 | [380, 269] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231935__600.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5360 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_231946__281 | 5 | 0.0 | 11.3165 | 3 | [380, 304] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231946__281.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5361 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_231954__678 | 5 | 0.0 | 7.16017 | 3 | [380, 174] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231954__678.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5362 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232005__316 | 5 | 0.0 | 10.8838 | 3 | [380, 291] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232005__316.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5363 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_232215__524 | 0 | 0.0 | 8.23284 | 0 | [74, 202] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232215__524.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5364 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_232226__809 | 5 | 0.0 | 11.6467 | 3 | [74, 290] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232226__809.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5365 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_232235__409 | 0 | 0.0 | 8.97181 | 0 | [74, 221] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232235__409.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5366 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_232246__219 | 0 | 0.0 | 10.0905 | 0 | [74, 250] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232246__219.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5367 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_232255__319 | 5 | 0.0 | 8.93829 | 3 | [74, 220] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232255__319.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5368 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232146__348 | 5 | 0.0 | 3.02144 | 3 | [115, 61] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232146__348.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5369 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232152__990 | 5 | 0.0 | 5.62216 | 3 | [115, 129] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232152__990.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5370 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232157__319 | 0 | 0.0 | 5.5898 | 0 | [115, 128] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232157__319.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5371 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232201__366 | 5 | 0.0 | 3.47311 | 3 | [115, 73] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232201__366.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5372 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232206__232 | 0 | 0.0 | 5.6987 | 0 | [115, 131] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232206__232.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5373 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232108__428 | 4 | 0.0 | 6.00559 | 2 | [191, 108] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232108__428.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5374 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232112__949 | 2 | 0.0 | 4.12843 | 3 | [191, 81] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232112__949.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5375 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232122__130 | 0 | 0.0 | 10.4541 | 0 | [191, 243] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232122__130.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5376 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_232130__682 | 0 | 0.0 | 7.52332 | 0 | [191, 168] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232130__682.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5377 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232143__539 | 0 | 0.0 | 12.7672 | 0 | [191, 302] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232143__539.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5378 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_232412__869 | 0 | 0.0 | 15.281 | 0 | [382, 333] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232412__869.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5379 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232427__824 | 0 | 0.0 | 14.9574 | 3 | [382, 325] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232427__824.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5380 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232442__809 | 0 | 0.0 | 14.6017 | 3 | [382, 316] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232442__809.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5381 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_232455__192 | 0 | 0.0 | 13.3863 | 0 | [382, 286] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232455__192.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5382 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232511__446 | 2 | 0.0 | 15.7356 | 3 | [382, 344] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232511__446.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5383 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232303__851 | 0 | 0.0 | 8.67932 | 0 | [380, 169] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232303__851.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5384 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232317__409 | 0 | 0.0 | 13.453 | 0 | [380, 288] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232317__409.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5385 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232332__399 | 0 | 0.0 | 15.1223 | 0 | [380, 329] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232332__399.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5386 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232348__329 | 0 | 0.0 | 15.7704 | 0 | [380, 345] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232348__329.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5387 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232357__222 | 5 | 0.0 | 9.0444 | 3 | [380, 178] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232357__222.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5388 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_121307__265 | 0 | 0.0 | 16.6272 | 0 | [71, 296] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_121307__265.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5389 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_121322__414 | 0 | 0.0 | 14.9901 | 0 | [71, 270] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_121322__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5390 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_121236__331 | 0 | 0.0 | 12.4128 | 0 | [74, 223] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121236__331.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5391 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_121250__352 | 5 | 0.0 | 13.9767 | 3 | [74, 251] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121250__352.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5392 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_234006__823 | 5 | 0.0 | 10.0083 | 3 | [74, 180] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_234006__823.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5393 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_121220__353 | 0 | 0.0 | 3.30762 | 0 | [115, 48] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_121220__353.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5394 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_121224__747 | 5 | 0.0 | 3.38196 | 3 | [115, 50] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_121224__747.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5395 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_233956__467 | 5 | 0.0 | 3.13946 | 3 | [115, 46] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_233956__467.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5396 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_121211__300 | 0 | 0.0 | 15.1551 | 0 | [191, 261] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121211__300.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5397 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_121217__570 | 5 | 0.0 | 5.71459 | 3 | [191, 83] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121217__570.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5398 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_233953__846 | 0 | 0.0 | 18.9278 | 0 | [191, 168] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233953__846.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5399 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_121417__712 | 5 | 0.0 | 21.4943 | 3 | [382, 353] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121417__712.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5400 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_121438__409 | 4 | 0.0 | 20.8479 | 3 | [382, 346] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121438__409.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5401 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_234046__624 | 5 | 0.0 | 19.7953 | 3 | [382, 326] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234046__624.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5402 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_121340__510 | 5 | 0.0 | 18.3095 | 3 | [380, 296] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121340__510.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5403 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_121355__819 | 5 | 0.0 | 14.8983 | 3 | [380, 230] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121355__819.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5404 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_234026__169 | 5 | 0.0 | 19.6768 | 3 | [380, 324] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_234026__169.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5405 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_111808__914 | 3 | 0.0 | 26.2935 | 2 | [78, 149] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_111808__914.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5406 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_111839__784 | 5 | 0.0 | 30.8272 | 3 | [78, 177] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_111839__784.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5407 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_111904__485 | 4 | 0.0 | 24.9915 | 3 | [78, 141] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_111904__485.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5408 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_145555__279 | 0 | 0.0 | 26.8863 | 0 | [78, 152] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_145555__279.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5409 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_145617__434 | 4 | 0.0 | 21.8552 | 2 | [78, 121] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_145617__434.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5410 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_111712__195 | 5 | 0.0 | 42.2076 | 3 | [117, 241] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_111712__195.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5411 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_111733__554 | 0 | 0.0 | 21.4569 | 0 | [117, 114] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_111733__554.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5412 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_111742__354 | 0 | 0.0 | 8.8638 | 0 | [117, 36] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_111742__354.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5413 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_145441__609 | 0 | 0.0 | 35.4635 | 3 | [117, 199] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_145441__609.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5414 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_145528__981 | 5 | 0.0 | 47.6715 | 3 | [117, 273] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_145528__981.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5415 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_111502__890 | 5 | 0.0 | 39.9599 | 3 | [192, 191] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111502__890.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5416 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_111546__606 | 4 | 0.0 | 44.202 | 2 | [192, 242] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111546__606.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5417 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_111629__602 | 0 | 0.0 | 43.0337 | 0 | [192, 235] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111629__602.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5418 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_145329__164 | 4 | 0.0 | 52.393 | 3 | [192, 290] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_145329__164.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5419 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_145405__830 | 0 | 0.0 | 35.9153 | 0 | [192, 191] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_145405__830.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5420 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112219__355 | 3 | 0.0 | 38.5577 | 3 | [391, 170] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112219__355.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5421 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112319__830 | 5 | 0.0 | 60.1815 | 3 | [391, 298] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112319__830.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5422 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_112330__366 | 0 | 0.0 | 10.9227 | 0 | [391, 4] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112330__366.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5423 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_145819__730 | 0 | 0.0 | 11.0575 | 0 | [391, 4] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_145819__730.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5424 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_145830__312 | 0 | 0.0 | 11.2011 | 0 | [391, 5] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_145830__312.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5425 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_111948__381 | 1 | 0.0 | 44.0952 | 3 | [389, 203] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_111948__381.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5426 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_112048__184 | 5 | 0.0 | 59.6211 | 3 | [389, 295] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_112048__184.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5427 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_112140__250 | 5 | 0.0 | 52.1442 | 3 | [389, 251] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_112140__250.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5428 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_145710__196 | 5 | 0.0 | 52.2777 | 3 | [389, 250] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_145710__196.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5429 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_145808__426 | 0 | 0.0 | 58.4041 | 3 | [389, 286] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_145808__426.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5430 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_231038__610 | 0 | 0.0 | 5.70154 | 0 | [62, 171] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_231038__610.json | 0.0 | missing | missing | missing | |
| 5431 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_231043__704 | 0 | 0.0 | 5.74635 | 0 | [1, 185] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_231043__704.json | 0.0 | missing | missing | missing | |
| 5432 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_231049__538 | 0 | 0.0 | 5.93262 | 0 | [1, 191] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_231049__538.json | 0.0 | missing | missing | missing | |
| 5433 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_231006__469 | 0 | 0.0 | 11.1476 | 0 | [79, 332] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_231006__469.json | 0.0 | missing | missing | missing | |
| 5434 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_231018__222 | 0 | 0.0 | 11.5963 | 0 | [1, 362] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_231018__222.json | 50.0 | missing | missing | missing | |
| 5435 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_231032__436 | 0 | 0.0 | 14.2971 | 0 | [1, 440] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_231032__436.json | 50.0 | missing | missing | missing | |
| 5436 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_233821__432 | 4 | 0.0 | 7.89372 | 2 | [83, 193] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_233821__432.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5437 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_230940__699 | 0 | 0.0 | 5.81375 | 0 | [108, 161] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230940__699.json | 50.0 | missing | missing | missing | |
| 5438 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_230949__170 | 0 | 0.0 | 8.27826 | 0 | [1, 259] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230949__170.json | 50.0 | missing | missing | missing | |
| 5439 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_230955__210 | 0 | 0.0 | 6.21679 | 0 | [1, 197] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230955__210.json | 50.0 | missing | missing | missing | |
| 5440 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_233813__965 | 0 | 0.0 | 3.24242 | 0 | [124, 67] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_233813__965.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5441 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_230906__477 | 0 | 0.0 | 15.779 | 0 | [184, 428] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230906__477.json | 50.0 | missing | missing | missing | |
| 5442 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_230925__665 | 0 | 0.0 | 18.1869 | 0 | [1, 530] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230925__665.json | 0.0 | missing | missing | missing | |
| 5443 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_230934__714 | 0 | 0.0 | 9.86972 | 0 | [1, 299] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230934__714.json | 50.0 | missing | missing | missing | |
| 5444 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_233810__931 | 0 | 0.0 | 13.6231 | 0 | [200, 169] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233810__931.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5445 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_231158__313 | 0 | 0.0 | 20.711 | 0 | [11, 560] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231158__313.json | 0.0 | missing | missing | missing | |
| 5446 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_231207__356 | 0 | 0.0 | 9.01409 | 0 | [1, 258] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231207__356.json | 50.0 | missing | missing | missing | |
| 5447 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_231220__190 | 0 | 0.0 | 12.775 | 0 | [1, 360] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231220__190.json | 0.0 | missing | missing | missing | |
| 5448 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233846__194 | 0 | 0.0 | 11.8528 | 0 | [391, 245] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233846__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5449 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_231105__542 | 0 | 0.0 | 15.5912 | 0 | [379, 344] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_231105__542.json | 50.0 | missing | missing | missing | |
| 5450 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_231124__943 | 0 | 0.0 | 18.7382 | 0 | [1, 516] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_231124__943.json | 0.0 | missing | missing | missing | |
| 5451 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_231138__356 | 0 | 0.0 | 13.7783 | 3 | [1, 387] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_231138__356.json | 75.0 | missing | missing | missing | |
| 5452 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_233834__987 | 0 | 0.0 | 12.6591 | 0 | [389, 265] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_233834__987.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5453 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231214_002817__915 | 0 | 0.0 | 6.25136 | 0 | [62, 187] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_002817__915.json | 0.0 | missing | missing | missing | |
| 5454 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_024836__787 | 0 | 0.0 | 5.38275 | 0 | [78, 167] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_024836__787.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5455 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_024842__753 | 0 | 0.0 | 6.109 | 0 | [78, 191] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_024842__753.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5456 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231214_002811__939 | 0 | 0.0 | 10.5197 | 0 | [79, 312] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_002811__939.json | 50.0 | missing | missing | missing | |
| 5457 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231225_024825__632 | 0 | 0.0 | 1.84122 | 0 | [81, 47] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_024825__632.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5458 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_024831__990 | 0 | 0.0 | 5.55964 | 0 | [81, 173] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_024831__990.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5459 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_232153__280 | 5 | 0.0 | 2.36119 | 3 | [81, 65] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_232153__280.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5460 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_002800__647 | 0 | 0.0 | 5.63532 | 0 | [108, 154] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_002800__647.json | 50.0 | missing | missing | missing | |
| 5461 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_024820__739 | 0 | 0.0 | 5.76361 | 0 | [122, 174] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_024820__739.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5462 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_024823__466 | 0 | 0.0 | 3.45226 | 0 | [122, 97] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_024823__466.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5463 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232151__213 | 0 | 0.0 | 2.51841 | 0 | [122, 65] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_232151__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5464 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_002754__199 | 0 | 0.0 | 14.3729 | 0 | [184, 387] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_002754__199.json | 0.0 | missing | missing | missing | |
| 5465 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_024807__475 | 5 | 0.0 | 11.9194 | 3 | [198, 184] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024807__475.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5466 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_024814__140 | 0 | 0.0 | 7.2822 | 0 | [198, 209] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024814__140.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5467 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_232148__343 | 4 | 0.0 | 13.1667 | 2 | [198, 239] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232148__343.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5468 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_002854__413 | 0 | 0.0 | 17.6585 | 0 | [11, 482] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_002854__413.json | 50.0 | missing | missing | missing | |
| 5469 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_024915__243 | 0 | 0.0 | 7.48489 | 0 | [389, 182] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024915__243.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5470 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_024922__291 | 5 | 0.0 | 7.65594 | 3 | [389, 188] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024922__291.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5471 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_232207__958 | 0 | 0.0 | 3.99228 | 0 | [389, 70] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232207__958.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5472 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_002837__745 | 0 | 0.0 | 19.7793 | 0 | [379, 453] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_002837__745.json | 50.0 | missing | missing | missing | |
| 5473 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_024852__190 | 5 | 0.0 | 9.98389 | 3 | [387, 262] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_024852__190.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5474 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_024907__166 | 0 | 0.0 | 14.6256 | 3 | [387, 406] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_024907__166.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5475 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232203__523 | 5 | 0.0 | 9.03081 | 3 | [387, 232] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_232203__523.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5476 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231214_003644__907 | 0 | 0.0 | 6.72952 | 0 | [62, 202] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__AsIs__1SHOT__20231214_003644__907.json | 0.0 | missing | missing | missing | |
| 5477 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_030530__449 | 0 | 0.0 | 10.9448 | 0 | [79, 193] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__AsIs__1SHOT__20231225_030530__449.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5478 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_030536__988 | 0 | 0.0 | 6.52823 | 0 | [79, 110] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__AsIs__1SHOT__20231225_030536__988.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5479 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231214_003637__787 | 0 | 0.0 | 13.598 | 0 | [79, 403] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__InJulia__1SHOT__20231214_003637__787.json | 50.0 | missing | missing | missing | |
| 5480 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_030513__271 | 0 | 0.0 | 8.10921 | 0 | [82, 139] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__InJulia__1SHOT__20231225_030513__271.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5481 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_030519__543 | 0 | 0.0 | 5.92723 | 0 | [82, 98] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__InJulia__1SHOT__20231225_030519__543.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5482 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231226_232909__978 | 0 | 0.0 | 8.1473 | 0 | [82, 141] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__InJulia__1SHOT__20231226_232909__978.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5483 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_003623__658 | 0 | 0.0 | 6.38548 | 0 | [108, 177] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_003623__658.json | 50.0 | missing | missing | missing | |
| 5484 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_030500__372 | 0 | 0.0 | 5.10159 | 0 | [121, 78] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_030500__372.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5485 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_030504__294 | 0 | 0.0 | 4.8068 | 0 | [121, 72] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_030504__294.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5486 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_232900__598 | 0 | 0.0 | 4.65808 | 0 | [121, 69] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_232900__598.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5487 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_003617__309 | 0 | 0.0 | 9.70911 | 0 | [184, 254] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003617__309.json | 50.0 | missing | missing | missing | |
| 5488 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_030439__342 | 0 | 0.0 | 24.681 | 0 | [197, 249] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030439__342.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5489 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_030454__763 | 0 | 0.0 | 15.1604 | 0 | [197, 250] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030454__763.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5490 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_232856__632 | 0 | 0.0 | 23.656 | 0 | [197, 243] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232856__632.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5491 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_003714__301 | 0 | 0.0 | 11.3702 | 0 | [11, 316] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003714__301.json | 50.0 | missing | missing | missing | |
| 5492 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_030642__307 | 0 | 0.0 | 7.67872 | 0 | [385, 80] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030642__307.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5493 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_030702__452 | 0 | 0.0 | 19.8975 | 0 | [385, 299] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030702__452.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5494 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232952__454 | 0 | 0.0 | 25.4567 | 0 | [385, 398] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232952__454.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5495 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_003702__811 | 0 | 0.0 | 18.4149 | 0 | [379, 418] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_003702__811.json | 50.0 | missing | missing | missing | |
| 5496 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_030614__629 | 0 | 0.0 | 38.1891 | 0 | [382, 617] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_030614__629.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5497 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_030634__451 | 0 | 0.0 | 19.8591 | 0 | [382, 303] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_030634__451.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5498 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232927__155 | 0 | 0.0 | 18.0361 | 0 | [382, 272] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_232927__155.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5499 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_231358__676 | 0 | 0.0 | 6.23993 | 0 | [62, 188] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_231358__676.json | 0.0 | missing | missing | missing | |
| 5500 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_231404__620 | 0 | 0.0 | 6.54473 | 0 | [1, 210] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_231404__620.json | 0.0 | missing | missing | missing | |
| 5501 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_231410__397 | 0 | 0.0 | 5.80284 | 0 | [1, 187] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_231410__397.json | 0.0 | missing | missing | missing | |
| 5502 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231219_231330__537 | 0 | 0.0 | 10.3275 | 0 | [79, 308] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_231330__537.json | 0.0 | missing | missing | missing | |
| 5503 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231219_231341__964 | 0 | 0.0 | 10.9216 | 0 | [1, 342] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_231341__964.json | 0.0 | missing | missing | missing | |
| 5504 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231219_231352__567 | 0 | 0.0 | 11.0346 | 0 | [1, 345] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_231352__567.json | 50.0 | missing | missing | missing | |
| 5505 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_233923__720 | 0 | 0.0 | 13.2979 | 0 | [72, 508] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_233923__720.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5506 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_231306__356 | 0 | 0.0 | 5.60134 | 0 | [108, 154] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_231306__356.json | 50.0 | missing | missing | missing | |
| 5507 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_231312__182 | 0 | 0.0 | 5.79347 | 0 | [1, 184] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_231312__182.json | 50.0 | missing | missing | missing | |
| 5508 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_231319__169 | 0 | 0.0 | 7.76262 | 0 | [1, 244] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_231319__169.json | 0.0 | missing | missing | missing | |
| 5509 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_233909__939 | 0 | 0.0 | 18.1033 | 0 | [109, 674] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_233909__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5510 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_231235__422 | 0 | 0.0 | 14.5896 | 3 | [184, 395] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231235__422.json | 75.0 | missing | missing | missing | |
| 5511 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_231246__574 | 0 | 0.0 | 11.0488 | 0 | [1, 333] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231246__574.json | 0.0 | missing | missing | missing | |
| 5512 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_231300__566 | 0 | 0.0 | 14.3269 | 0 | [1, 425] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231300__566.json | 50.0 | missing | missing | missing | |
| 5513 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_233851__651 | 0 | 0.0 | 5.23812 | 0 | [183, 49] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233851__651.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5514 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_231522__349 | 0 | 0.0 | 23.6075 | 0 | [11, 632] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231522__349.json | 50.0 | missing | missing | missing | |
| 5515 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_231542__423 | 0 | 0.0 | 19.2379 | 0 | [1, 528] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231542__423.json | 0.0 | missing | missing | missing | |
| 5516 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_231553__595 | 0 | 0.0 | 11.4661 | 0 | [1, 325] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231553__595.json | 25.0 | missing | missing | missing | |
| 5517 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_233934__924 | 0 | 0.0 | 6.29127 | 0 | [361, 193] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233934__924.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5518 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_231426__291 | 0 | 0.0 | 15.8491 | 0 | [379, 351] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_231426__291.json | 0.0 | missing | missing | missing | |
| 5519 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_231443__121 | 0 | 0.0 | 16.9624 | 0 | [1, 470] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_231443__121.json | 50.0 | missing | missing | missing | |
| 5520 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_231459__487 | 0 | 0.0 | 15.5666 | 0 | [1, 434] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_231459__487.json | 0.0 | missing | missing | missing | |
| 5521 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_233927__409 | 0 | 0.0 | 4.92657 | 0 | [358, 142] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_233927__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5522 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231214_003746__185 | 0 | 0.0 | 5.2891 | 0 | [62, 157] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_003746__185.json | 0.0 | missing | missing | missing | |
| 5523 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_030942__903 | 0 | 0.0 | 15.9026 | 0 | [87, 114] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_030942__903.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5524 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_030956__905 | 0 | 0.0 | 14.6767 | 0 | [87, 104] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_030956__905.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5525 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231214_003741__348 | 0 | 0.0 | 9.23832 | 0 | [79, 274] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_003741__348.json | 0.0 | missing | missing | missing | |
| 5526 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_030900__131 | 4 | 0.0 | 25.6567 | 3 | [90, 193] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_030900__131.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5527 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_030926__722 | 3 | 0.0 | 25.1571 | 2 | [90, 189] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_030926__722.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5528 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231226_233110__685 | 3 | 0.0 | 20.9688 | 2 | [90, 155] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_233110__685.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5529 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_003732__450 | 0 | 0.0 | 7.68799 | 0 | [108, 217] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_003732__450.json | 0.0 | missing | missing | missing | |
| 5530 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030811__967 | 4 | 0.0 | 11.2634 | 2 | [129, 66] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_030811__967.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5531 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030835__854 | 3 | 0.0 | 23.9729 | 2 | [129, 169] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_030835__854.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5532 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_233049__599 | 4 | 0.0 | 14.0142 | 3 | [129, 89] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_233049__599.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5533 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_003724__331 | 0 | 0.0 | 10.2001 | 0 | [184, 268] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003724__331.json | 0.0 | missing | missing | missing | |
| 5534 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_030737__820 | 0 | 0.0 | 35.1258 | 0 | [205, 65] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030737__820.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5535 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_030759__107 | 3 | 0.0 | 22.4025 | 2 | [205, 145] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030759__107.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5536 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_233035__210 | 4 | 0.0 | 42.9161 | 2 | [205, 142] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233035__210.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5537 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_003821__681 | 0 | 0.0 | 18.4429 | 0 | [11, 501] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003821__681.json | 50.0 | missing | missing | missing | |
| 5538 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_031118__815 | 4 | 0.0 | 28.1766 | 3 | [393, 156] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_031118__815.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5539 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_031144__173 | 5 | 0.0 | 26.0046 | 3 | [393, 139] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_031144__173.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5540 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233205__706 | 4 | 0.0 | 34.5579 | 3 | [393, 207] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233205__706.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5541 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_003802__180 | 0 | 0.0 | 15.9389 | 0 | [379, 352] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_003802__180.json | 50.0 | missing | missing | missing | |
| 5542 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_031033__841 | 4 | 0.0 | 36.4428 | 3 | [390, 222] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_031033__841.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5543 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_031050__671 | 5 | 0.0 | 16.7406 | 3 | [390, 67] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_031050__671.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5544 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_233130__191 | 5 | 0.0 | 19.9364 | 3 | [390, 93] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_233130__191.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5545 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_230326__998 | 0 | 0.0 | 11.6779 | 0 | [62, 353] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_230326__998.json | 0.0 | missing | missing | missing | |
| 5546 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_230331__135 | 0 | 0.0 | 5.54744 | 0 | [1, 179] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_230331__135.json | 0.0 | missing | missing | missing | |
| 5547 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_230337__392 | 0 | 0.0 | 6.036 | 0 | [1, 194] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_230337__392.json | 0.0 | missing | missing | missing | |
| 5548 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_230300__278 | 0 | 0.0 | 13.1957 | 0 | [79, 393] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_230300__278.json | 50.0 | missing | missing | missing | |
| 5549 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_230309__551 | 0 | 0.0 | 8.84277 | 0 | [1, 280] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_230309__551.json | 0.0 | missing | missing | missing | |
| 5550 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_230314__755 | 0 | 0.0 | 4.9831 | 0 | [1, 161] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_230314__755.json | 50.0 | missing | missing | missing | |
| 5551 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_233630__536 | 0 | 0.0 | 12.1326 | 0 | [83, 201] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_233630__536.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5552 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_230236__890 | 0 | 0.0 | 6.94868 | 0 | [108, 196] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230236__890.json | 50.0 | missing | missing | missing | |
| 5553 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_230242__723 | 0 | 0.0 | 5.15585 | 0 | [1, 164] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230242__723.json | 50.0 | missing | missing | missing | |
| 5554 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_230247__387 | 0 | 0.0 | 5.50709 | 0 | [1, 175] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230247__387.json | 50.0 | missing | missing | missing | |
| 5555 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_233618__651 | 0 | 0.0 | 11.6866 | 0 | [124, 188] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_233618__651.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5556 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_230207__354 | 0 | 0.0 | 7.99854 | 0 | [184, 205] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230207__354.json | 0.0 | missing | missing | missing | |
| 5557 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_230219__564 | 0 | 0.0 | 11.7534 | 0 | [1, 353] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230219__564.json | 0.0 | missing | missing | missing | |
| 5558 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_230229__895 | 0 | 0.0 | 10.7059 | 0 | [1, 323] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230229__895.json | 50.0 | missing | missing | missing | |
| 5559 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_031849__866 | 0 | 0.0 | 23.2862 | 0 | [200, 214] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_031849__866.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5560 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_233606__412 | 0 | 0.0 | 25.2807 | 0 | [200, 257] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233606__412.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5561 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_230450__188 | 0 | 0.0 | 22.8973 | 0 | [11, 615] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230450__188.json | 50.0 | missing | missing | missing | |
| 5562 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_230504__471 | 0 | 0.0 | 14.84 | 3 | [1, 415] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230504__471.json | 75.0 | missing | missing | missing | |
| 5563 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_230525__192 | 0 | 0.0 | 20.9784 | 0 | [1, 572] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230525__192.json | 50.0 | missing | missing | missing | |
| 5564 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233702__178 | 0 | 0.0 | 14.3674 | 0 | [391, 191] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233702__178.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5565 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_230357__500 | 0 | 0.0 | 19.0357 | 0 | [379, 435] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230357__500.json | 0.0 | missing | missing | missing | |
| 5566 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_230416__994 | 0 | 0.0 | 19.4844 | 0 | [1, 535] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230416__994.json | 0.0 | missing | missing | missing | |
| 5567 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_230427__748 | 0 | 0.0 | 10.6106 | 0 | [1, 302] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230427__748.json | 50.0 | missing | missing | missing | |
| 5568 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_233647__382 | 0 | 0.0 | 16.722 | 0 | [389, 231] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_233647__382.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5569 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231214_003533__249 | 0 | 0.0 | 5.5997 | 0 | [62, 167] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__AsIs__1SHOT__20231214_003533__249.json | 0.0 | missing | missing | missing | |
| 5570 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_030349__715 | 0 | 0.0 | 3.98311 | 0 | [80, 225] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_030349__715.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5571 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_030353__307 | 0 | 0.0 | 3.98346 | 0 | [80, 225] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_030353__307.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5572 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231214_003527__690 | 0 | 0.0 | 12.4482 | 0 | [79, 370] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_003527__690.json | 0.0 | missing | missing | missing | |
| 5573 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_030342__432 | 0 | 0.0 | 4.32645 | 0 | [83, 245] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_030342__432.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5574 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_030345__982 | 0 | 0.0 | 3.56945 | 0 | [83, 201] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_030345__982.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5575 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231226_232824__354 | 0 | 0.0 | 2.95855 | 0 | [83, 165] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_232824__354.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5576 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_003515__389 | 0 | 0.0 | 8.4558 | 0 | [108, 240] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_003515__389.json | 0.0 | missing | missing | missing | |
| 5577 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030335__599 | 0 | 0.0 | 1.35695 | 0 | [120, 65] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_030335__599.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5578 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030337__991 | 0 | 0.0 | 2.05819 | 0 | [120, 106] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_030337__991.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5579 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232821__288 | 0 | 0.0 | 1.9188 | 0 | [120, 98] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_232821__288.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5580 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_003506__618 | 0 | 0.0 | 16.7829 | 0 | [184, 454] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003506__618.json | 0.0 | missing | missing | missing | |
| 5581 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_030330__125 | 0 | 0.0 | 7.49792 | 0 | [192, 240] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030330__125.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5582 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_030334__364 | 0 | 0.0 | 4.26658 | 3 | [192, 220] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030334__364.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5583 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_232819__499 | 0 | 0.0 | 5.26938 | 0 | [192, 123] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232819__499.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5584 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_003607__717 | 0 | 0.0 | 17.0085 | 0 | [11, 464] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003607__717.json | 50.0 | missing | missing | missing | |
| 5585 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_030409__270 | 0 | 0.0 | 7.32203 | 0 | [370, 336] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030409__270.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5586 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_030415__860 | 0 | 0.0 | 5.9128 | 0 | [370, 263] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030415__860.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5587 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232832__823 | 0 | 0.0 | 3.74105 | 3 | [370, 148] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232832__823.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5588 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_003550__880 | 0 | 0.0 | 17.337 | 0 | [379, 389] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_003550__880.json | 50.0 | missing | missing | missing | |
| 5589 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_030357__473 | 0 | 0.0 | 3.67284 | 0 | [368, 144] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_030357__473.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5590 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_030401__575 | 0 | 0.0 | 3.93186 | 0 | [368, 159] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_030401__575.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5591 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232828__899 | 0 | 0.0 | 4.71995 | 0 | [368, 200] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_232828__899.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5592 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231214_002936__342 | 0 | 0.0 | 13.2979 | 0 | [62, 399] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__AsIs__1SHOT__20231214_002936__342.json | 0.0 | missing | missing | missing | |
| 5593 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_025022__804 | 0 | 0.0 | 6.12278 | 0 | [80, 191] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_025022__804.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5594 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_025030__971 | 0 | 0.0 | 7.88465 | 0 | [80, 250] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_025030__971.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5595 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231214_002922__138 | 4 | 0.0 | 9.71859 | 3 | [79, 289] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_002922__138.json | 95.0 | missing | missing | missing | |
| 5596 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_025011__965 | 0 | 0.0 | 6.23612 | 0 | [83, 194] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_025011__965.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5597 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_025016__463 | 0 | 0.0 | 4.9652 | 0 | [83, 152] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_025016__463.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5598 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_232234__649 | 5 | 0.0 | 4.88108 | 3 | [83, 150] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_232234__649.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5599 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_002913__461 | 0 | 0.0 | 5.60071 | 0 | [108, 153] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_002913__461.json | 50.0 | missing | missing | missing | |
| 5600 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_024958__389 | 0 | 0.0 | 4.60681 | 0 | [124, 135] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_024958__389.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5601 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_025005__574 | 0 | 0.0 | 6.25347 | 0 | [124, 190] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_025005__574.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5602 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232229__985 | 0 | 0.0 | 5.13547 | 0 | [124, 153] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_232229__985.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5603 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_002907__410 | 0 | 0.0 | 12.4337 | 0 | [184, 333] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_002907__410.json | 50.0 | missing | missing | missing | |
| 5604 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_024944__237 | 0 | 0.0 | 21.0784 | 0 | [200, 472] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024944__237.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5605 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_024954__438 | 2 | 0.0 | 9.89302 | 3 | [200, 292] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024954__438.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5606 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_232223__113 | 0 | 0.0 | 16.7775 | 0 | [200, 350] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232223__113.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5607 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_003014__327 | 0 | 0.0 | 20.3963 | 0 | [11, 551] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003014__327.json | 0.0 | missing | missing | missing | |
| 5608 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_025051__587 | 5 | 0.0 | 7.85462 | 3 | [391, 194] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_025051__587.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5609 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_025059__745 | 0 | 0.0 | 8.63762 | 0 | [391, 218] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_025059__745.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5610 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232301__241 | 0 | 0.0 | 14.0533 | 0 | [391, 389] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232301__241.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5611 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_002954__785 | 0 | 0.0 | 17.8291 | 0 | [379, 402] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_002954__785.json | 50.0 | missing | missing | missing | |
| 5612 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_025037__410 | 0 | 0.0 | 6.58774 | 0 | [389, 153] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_025037__410.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5613 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_025043__807 | 0 | 0.0 | 5.64814 | 0 | [389, 123] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_025043__807.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5614 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232247__844 | 0 | 0.0 | 12.9514 | 0 | [389, 355] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_232247__844.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5615 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231214_003052__499 | 0 | 0.0 | 6.31628 | 0 | [62, 190] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__AsIs__1SHOT__20231214_003052__499.json | 0.0 | missing | missing | missing | |
| 5616 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231225_025426__259 | 0 | 0.0 | 44.892 | 0 | [75, 336] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_025426__259.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5617 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231225_025458__434 | 0 | 0.0 | 32.2163 | 0 | [75, 238] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_025458__434.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5618 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | InJulia | 1SHOT | false | false | 5 | 20231214_003045__284 | 0 | 0.0 | 10.2376 | 0 | [79, 304] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_003045__284.json | 0.0 | missing | missing | missing | |
| 5619 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_025300__758 | 5 | 0.0 | 40.4818 | 3 | [78, 302] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_025300__758.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5620 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_025341__510 | 5 | 0.0 | 40.8867 | 3 | [78, 305] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_025341__510.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5621 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231226_232440__994 | 0 | 0.0 | 48.4699 | 2 | [78, 365] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_232440__994.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5622 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_003035__707 | 0 | 0.0 | 5.2328 | 0 | [108, 142] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_003035__707.json | 0.0 | missing | missing | missing | |
| 5623 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_025157__364 | 0 | 0.0 | 8.5765 | 0 | [117, 47] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_025157__364.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5624 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_025219__831 | 0 | 0.0 | 21.7883 | 0 | [117, 151] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_025219__831.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5625 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232352__814 | 0 | 0.0 | 7.6409 | 3 | [117, 40] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_232352__814.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5626 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_003030__126 | 0 | 0.0 | 15.7708 | 0 | [184, 427] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003030__126.json | 50.0 | missing | missing | missing | |
| 5627 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_025136__780 | 0 | 0.0 | 36.6313 | 3 | [192, 65] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_025136__780.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5628 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_025149__717 | 0 | 0.0 | 12.572 | 0 | [192, 68] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_025149__717.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5629 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_232344__584 | 5 | 0.0 | 43.1168 | 3 | [192, 132] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232344__584.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5630 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_003122__420 | 0 | 0.0 | 18.5528 | 0 | [11, 504] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003122__420.json | 0.0 | missing | missing | missing | |
| 5631 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_025713__774 | 5 | 0.0 | 64.9576 | 3 | [391, 422] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_025713__774.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5632 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_025832__722 | 0 | 0.0 | 78.7542 | 0 | [391, 522] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_025832__722.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5633 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232632__423 | 0 | 0.0 | 62.6121 | 3 | [391, 407] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232632__423.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5634 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_003103__187 | 0 | 0.0 | 11.4866 | 0 | [379, 231] | 0.10.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_003103__187.json | 25.0 | missing | missing | missing | |
| 5635 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_025534__415 | 5 | 0.0 | 36.3488 | 3 | [389, 211] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_025534__415.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5636 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_025608__637 | 0 | 0.0 | 33.4201 | 0 | [389, 189] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_025608__637.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5637 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232529__967 | 5 | 0.0 | 49.2928 | 3 | [389, 309] | 0.10.0-DEV | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_232529__967.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5638 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231214_004641__986 | 0 | 0.0 | 14.0165 | 0 | [107, 403] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231214_004641__986.json | 0.0 | missing | missing | missing | |
| 5639 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_171450__345 | 0 | 0.0 | 23.5326 | 0 | [129, 420] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_171450__345.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5640 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_171512__553 | 0 | 0.0 | 22.1094 | 0 | [129, 394] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_171512__553.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5641 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231214_004626__458 | 1 | 0.0 | 15.8863 | 1 | [124, 453] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_004626__458.json | 60.0 | missing | missing | missing | |
| 5642 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_171358__906 | 1 | 0.0 | 14.8461 | 1 | [132, 259] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_171358__906.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5643 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_171426__563 | 0 | 0.0 | 27.7913 | 0 | [132, 497] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_171426__563.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5644 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231226_235127__501 | 0 | 0.0 | 14.9984 | 0 | [132, 260] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_235127__501.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5645 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_004610__809 | 1 | 0.0 | 14.227 | 1 | [153, 395] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_004610__809.json | 60.0 | missing | missing | missing | |
| 5646 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_171330__504 | 0 | 0.0 | 8.51782 | 0 | [170, 134] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_171330__504.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5647 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_171343__472 | 4 | 0.0 | 12.6566 | 5 | [170, 212] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_171343__472.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5648 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_235112__303 | 0 | 0.0 | 16.6936 | 0 | [170, 286] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_235112__303.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5649 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_004556__249 | 0 | 0.0 | 19.7405 | 0 | [300, 485] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004556__249.json | 0.0 | missing | missing | missing | |
| 5650 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_171302__929 | 3 | 0.0 | 23.8435 | 5 | [318, 214] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_171302__929.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5651 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_171322__533 | 0 | 0.0 | 19.432 | 0 | [318, 312] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_171322__533.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5652 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_235055__384 | 1 | 0.0 | 26.2682 | 1 | [318, 264] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235055__384.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5653 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_004703__540 | 0 | 0.0 | 7.64938 | 0 | [11, 210] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004703__540.json | 0.0 | missing | missing | missing | |
| 5654 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_171635__893 | 0 | 0.0 | 36.2926 | 0 | [435, 580] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_171635__893.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5655 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_171701__848 | 0 | 0.0 | 25.7624 | 0 | [435, 399] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_171701__848.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5656 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_235219__988 | 0 | 0.0 | 22.6399 | 0 | [435, 343] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235219__988.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5657 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_004656__697 | 0 | 0.0 | 15.0089 | 0 | [424, 305] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_004656__697.json | 0.0 | missing | missing | missing | |
| 5658 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_171540__588 | 1 | 0.0 | 28.3116 | 1 | [432, 444] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_171540__588.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5659 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_171558__720 | 5 | 0.0 | 18.3214 | 5 | [432, 268] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_171558__720.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5660 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_235157__144 | 0 | 0.0 | 30.0682 | 0 | [432, 472] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_235157__144.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5661 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004320__328 | 1 | 0.0 | 3.08104 | 1 | [0, 232] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004320__328.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5662 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004324__530 | 1 | 0.0 | 3.06552 | 1 | [0, 231] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004324__530.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5663 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004327__554 | 0 | 0.0 | 3.66195 | 0 | [0, 262] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004327__554.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5664 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004333__876 | 0 | 0.0 | 5.432 | 0 | [0, 396] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004333__876.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5665 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004342__511 | 0 | 0.0 | 9.04044 | 0 | [0, 656] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004342__511.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5666 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_004236__363 | 0 | 0.0 | 5.55187 | 0 | [0, 404] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004236__363.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5667 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_004240__594 | 0 | 0.0 | 3.48983 | 0 | [0, 256] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004240__594.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5668 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_004243__556 | 5 | 0.0 | 3.2857 | 5 | [0, 241] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004243__556.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5669 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_004247__399 | 0 | 0.0 | 3.23447 | 0 | [0, 237] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004247__399.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5670 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_004250__525 | 2 | 0.0 | 2.9853 | 4 | [0, 219] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004250__525.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5671 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004159__697 | 3 | 0.0 | 3.84987 | 5 | [0, 279] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004159__697.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5672 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004204__381 | 1 | 0.0 | 5.63279 | 1 | [0, 405] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004204__381.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5673 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004208__715 | 0 | 0.0 | 3.15074 | 0 | [0, 228] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004208__715.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5674 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004213__772 | 0 | 0.0 | 5.28525 | 0 | [0, 386] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004213__772.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5675 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004217__490 | 1 | 0.0 | 3.46322 | 1 | [0, 255] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004217__490.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5676 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004451__785 | 0 | 0.0 | 5.95534 | 0 | [0, 423] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004451__785.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5677 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004457__728 | 2 | 0.0 | 6.68024 | 5 | [0, 473] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004457__728.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5678 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004500__335 | 0 | 0.0 | 2.13777 | 0 | [0, 154] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004500__335.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5679 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004502__424 | 1 | 0.0 | 1.87784 | 1 | [0, 138] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004502__424.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5680 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_004504__364 | 0 | 0.0 | 2.46295 | 0 | [0, 181] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004504__364.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5681 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_004404__512 | 0 | 0.0 | 3.93744 | 0 | [0, 281] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004404__512.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5682 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_004407__784 | 0 | 0.0 | 2.62511 | 0 | [0, 188] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004407__784.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5683 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_004411__737 | 0 | 0.0 | 3.60787 | 0 | [0, 258] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004411__737.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5684 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_004416__594 | 0 | 0.0 | 5.30251 | 0 | [0, 378] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004416__594.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5685 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_004420__591 | 1 | 0.0 | 4.06357 | 1 | [0, 290] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004420__591.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5686 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231214_004825__436 | 0 | 0.0 | 21.7021 | 0 | [107, 615] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__AsIs__1SHOT__20231214_004825__436.json | 0.0 | missing | missing | missing | |
| 5687 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231214_004803__561 | 0 | 0.0 | 14.747 | 0 | [124, 422] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_004803__561.json | 0.0 | missing | missing | missing | |
| 5688 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_171750__214 | 0 | 0.0 | 8.59846 | 0 | [106, 146] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_171750__214.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5689 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_171809__475 | 0 | 0.0 | 19.3163 | 0 | [106, 348] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_171809__475.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5690 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_004748__480 | 0 | 0.0 | 17.4715 | 0 | [153, 485] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_004748__480.json | 0.0 | missing | missing | missing | |
| 5691 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_171733__763 | 0 | 0.0 | 8.12322 | 0 | [107, 137] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_171733__763.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5692 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_171741__337 | 0 | 0.0 | 8.28883 | 0 | [107, 140] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_171741__337.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5693 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_004731__703 | 1 | 0.0 | 27.5178 | 1 | [300, 683] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004731__703.json | 60.0 | missing | missing | missing | |
| 5694 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_171717__401 | 0 | 0.0 | 16.5429 | 0 | [193, 95] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_171717__401.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5695 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_171725__492 | 0 | 0.0 | 7.32047 | 0 | [193, 106] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_171725__492.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5696 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_004935__515 | 0 | 0.0 | 34.5578 | 0 | [11, 880] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004935__515.json | 0.0 | missing | missing | missing | |
| 5697 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_172135__218 | 0 | 0.0 | 1.61038 | 0 | [124, 11] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172135__218.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5698 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_172137__347 | 0 | 0.0 | 1.55744 | 0 | [124, 10] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172137__347.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5699 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_004901__206 | 0 | 0.0 | 35.6118 | 0 | [424, 817] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_004901__206.json | 0.0 | missing | missing | missing | |
| 5700 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_172121__763 | 0 | 0.0 | 40.5756 | 0 | [121, 728] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_172121__763.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5701 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_172133__766 | 0 | 0.0 | 12.5963 | 0 | [121, 221] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_172133__766.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5702 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_005144__611 | 0 | 0.0 | 8.86709 | 0 | [0, 315] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_005144__611.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5703 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_005153__985 | 0 | 0.0 | 8.89789 | 0 | [0, 315] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_005153__985.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5704 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_005200__984 | 0 | 0.0 | 6.79082 | 0 | [0, 243] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_005200__984.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5705 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_005211__239 | 0 | 0.0 | 10.8927 | 0 | [0, 391] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_005211__239.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5706 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_005222__978 | 0 | 0.0 | 11.3995 | 0 | [0, 410] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_005222__978.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5707 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_005011__888 | 0 | 0.0 | 4.53026 | 0 | [0, 161] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_005011__888.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5708 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_005022__424 | 0 | 0.0 | 11.3058 | 0 | [0, 400] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_005022__424.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5709 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_005031__320 | 0 | 0.0 | 8.13276 | 0 | [0, 288] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_005031__320.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5710 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_005038__800 | 0 | 0.0 | 7.509 | 0 | [0, 266] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_005038__800.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5711 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_005047__554 | 0 | 0.0 | 8.82738 | 0 | [0, 313] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_005047__554.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5712 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_004849__956 | 0 | 0.0 | 6.68028 | 0 | [0, 239] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004849__956.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5713 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_004856__353 | 0 | 0.0 | 6.76007 | 0 | [0, 242] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004856__353.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5714 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_004905__459 | 0 | 0.0 | 9.42861 | 0 | [0, 336] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004905__459.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5715 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004915__494 | 4 | 0.0 | 9.82016 | 5 | [0, 350] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004915__494.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5716 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004925__744 | 0 | 0.0 | 10.3225 | 0 | [0, 368] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004925__744.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5717 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_005525__658 | 3 | 0.0 | 12.6107 | 5 | [0, 447] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_005525__658.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5718 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_005533__593 | 0 | 0.0 | 7.51771 | 0 | [0, 267] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_005533__593.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5719 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_005539__506 | 0 | 0.0 | 6.4357 | 0 | [0, 229] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_005539__506.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5720 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_005543__492 | 0 | 0.0 | 3.72913 | 0 | [0, 133] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_005543__492.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5721 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_005548__346 | 0 | 0.0 | 5.10878 | 0 | [0, 182] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_005548__346.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5722 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_005332__538 | 0 | 0.0 | 23.3129 | 0 | [0, 814] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_005332__538.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5723 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_005348__904 | 0 | 0.0 | 16.6905 | 0 | [0, 591] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_005348__904.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5724 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_005411__952 | 0 | 0.0 | 22.2072 | 0 | [0, 778] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_005411__952.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5725 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_005416__618 | 0 | 0.0 | 5.39623 | 0 | [0, 189] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_005416__618.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5726 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_005427__423 | 0 | 0.0 | 10.8316 | 0 | [0, 379] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_005427__423.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5727 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_003541__706 | 0 | 0.0 | 10.7201 | 0 | [0, 260] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_003541__706.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5728 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_003605__592 | 1 | 0.0 | 23.8436 | 1 | [0, 574] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_003605__592.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5729 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | false | 5 | 20240201_003625__311 | 0 | 0.0 | 20.3748 | 0 | [0, 491] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_003625__311.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5730 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | false | 5 | 20240201_003658__375 | 0 | 0.0 | 33.0037 | 0 | [0, 791] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_003658__375.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5731 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_003712__666 | 0 | 0.0 | 13.1114 | 0 | [0, 317] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_003712__666.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5732 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_003344__110 | 0 | 0.0 | 17.8248 | 0 | [0, 435] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_003344__110.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5733 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_003350__996 | 0 | 0.0 | 5.37002 | 0 | [0, 132] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_003350__996.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5734 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_003403__669 | 1 | 0.0 | 13.1875 | 1 | [0, 323] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_003403__669.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5735 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_003412__545 | 0 | 0.0 | 9.38086 | 0 | [0, 230] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_003412__545.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5736 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_003423__242 | 0 | 0.0 | 10.1995 | 0 | [0, 250] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_003423__242.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5737 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_003141__442 | 1 | 0.0 | 12.7991 | 1 | [0, 307] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_003141__442.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5738 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_003146__816 | 0 | 0.0 | 4.89494 | 0 | [0, 118] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_003146__816.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5739 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_003204__594 | 0 | 0.0 | 18.0878 | 0 | [0, 432] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_003204__594.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5740 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_003217__991 | 0 | 0.0 | 13.4753 | 0 | [0, 323] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_003217__991.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5741 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_003233__135 | 4 | 0.0 | 16.1287 | 4 | [0, 386] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_003233__135.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5742 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_004000__472 | 0 | 0.0 | 0.821905 | 0 | [0, 20] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004000__472.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5743 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004015__411 | 1 | 0.0 | 14.9634 | 1 | [0, 361] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004015__411.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5744 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_004054__989 | 0 | 0.0 | 39.233 | 0 | [0, 936] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004054__989.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5745 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_004103__583 | 0 | 0.0 | 8.36894 | 0 | [0, 202] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004103__583.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5746 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004124__356 | 1 | 0.0 | 20.9987 | 1 | [0, 503] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004124__356.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5747 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_003832__461 | 0 | 0.0 | 17.4707 | 0 | [0, 420] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_003832__461.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5748 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_003845__147 | 0 | 0.0 | 13.1908 | 0 | [0, 317] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_003845__147.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5749 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_003857__778 | 0 | 0.0 | 12.6013 | 0 | [0, 303] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_003857__778.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5750 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_003906__920 | 0 | 0.0 | 8.53737 | 0 | [0, 206] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_003906__920.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5751 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_003914__834 | 0 | 0.0 | 7.47579 | 0 | [0, 177] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_003914__834.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5752 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_002117__124 | 0 | 0.0 | 20.2424 | 0 | [0, 378] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_002117__124.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5753 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_002139__800 | 0 | 0.0 | 22.3263 | 0 | [0, 416] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_002139__800.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5754 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_002202__793 | 5 | 0.0 | 23.0066 | 5 | [0, 429] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_002202__793.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5755 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_002230__261 | 0 | 0.0 | 27.2348 | 0 | [0, 507] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_002230__261.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5756 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_002240__782 | 0 | 0.0 | 9.97521 | 0 | [0, 187] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_002240__782.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5757 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_001737__516 | 0 | 0.0 | 17.1888 | 0 | [0, 321] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_001737__516.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5758 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_001754__645 | 0 | 0.0 | 16.3377 | 0 | [0, 305] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_001754__645.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5759 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_001824__398 | 1 | 0.0 | 30.7036 | 1 | [0, 570] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_001824__398.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5760 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_001849__933 | 0 | 0.0 | 23.9735 | 0 | [0, 446] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_001849__933.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5761 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_001858__154 | 0 | 0.0 | 9.52975 | 0 | [0, 178] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_001858__154.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5762 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_001431__235 | 0 | 0.0 | 20.6217 | 0 | [0, 378] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_001431__235.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5763 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_001449__193 | 0 | 0.0 | 16.8255 | 0 | [0, 309] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_001449__193.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5764 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_001503__833 | 0 | 0.0 | 13.8071 | 0 | [0, 254] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_001503__833.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5765 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_001516__182 | 1 | 0.0 | 13.8284 | 1 | [0, 254] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_001516__182.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5766 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_001530__625 | 1 | 0.0 | 13.3349 | 1 | [0, 246] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_001530__625.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5767 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_002747__232 | 0 | 0.0 | 8.27103 | 0 | [0, 153] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_002747__232.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5768 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_002758__802 | 1 | 0.0 | 11.7177 | 1 | [0, 217] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_002758__802.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5769 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_002832__622 | 0 | 0.0 | 33.79 | 0 | [0, 622] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_002832__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5770 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_002847__500 | 1 | 0.0 | 14.5063 | 1 | [0, 268] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_002847__500.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5771 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_002920__744 | 0 | 0.0 | 33.2895 | 0 | [0, 613] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_002920__744.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5772 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_002431__964 | 0 | 0.0 | 9.83876 | 0 | [0, 182] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_002431__964.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5773 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_002457__801 | 0 | 0.0 | 25.9539 | 0 | [0, 478] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_002457__801.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5774 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_002526__650 | 0 | 0.0 | 29.2062 | 0 | [0, 538] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_002526__650.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5775 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_002543__147 | 1 | 0.0 | 16.0696 | 1 | [0, 297] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_002543__147.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5776 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_002603__808 | 3 | 0.0 | 20.2547 | 5 | [0, 374] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_002603__808.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5777 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004613__300 | 4 | 0.0 | 2.30259 | 5 | [0, 265] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004613__300.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5778 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004616__675 | 1 | 0.0 | 3.66217 | 1 | [0, 422] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004616__675.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5779 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004619__408 | 1 | 0.0 | 2.39684 | 1 | [0, 278] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004619__408.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5780 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004622__636 | 0 | 0.0 | 2.76131 | 0 | [0, 319] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004622__636.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5781 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_004625__512 | 4 | 0.0 | 2.27824 | 4 | [0, 264] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_004625__512.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5782 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_004550__113 | 0 | 0.0 | 3.24614 | 0 | [0, 360] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004550__113.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5783 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_004553__925 | 0 | 0.0 | 3.17261 | 0 | [0, 359] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004553__925.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5784 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_004555__800 | 0 | 0.0 | 1.38256 | 0 | [0, 160] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004555__800.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5785 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_004556__163 | 0 | 0.0 | 1.24206 | 0 | [0, 144] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004556__163.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5786 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_004557__992 | 0 | 0.0 | 0.965732 | 0 | [0, 112] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_004557__992.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5787 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004525__782 | 1 | 0.0 | 2.53718 | 1 | [0, 300] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004525__782.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5788 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004528__420 | 1 | 0.0 | 2.84691 | 1 | [0, 336] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004528__420.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5789 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004531__820 | 1 | 0.0 | 3.56981 | 1 | [0, 419] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004531__820.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5790 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_004533__489 | 0 | 0.0 | 1.30426 | 0 | [0, 155] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004533__489.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5791 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_004535__870 | 1 | 0.0 | 2.17755 | 1 | [0, 250] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_004535__870.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5792 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_004714__128 | 0 | 0.0 | 3.47885 | 0 | [0, 404] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004714__128.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5793 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004718__213 | 0 | 0.0 | 4.00334 | 0 | [0, 465] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004718__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5794 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004721__160 | 3 | 0.0 | 3.14116 | 5 | [0, 367] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004721__160.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5795 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004724__482 | 1 | 0.0 | 3.46056 | 1 | [0, 401] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004724__482.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5796 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_004728__888 | 1 | 0.0 | 3.68196 | 1 | [0, 429] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_004728__888.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5797 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_004643__399 | 1 | 0.0 | 2.40285 | 1 | [0, 271] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004643__399.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5798 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_004647__957 | 0 | 0.0 | 4.21602 | 0 | [0, 473] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004647__957.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5799 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_004648__499 | 0 | 0.0 | 0.923256 | 0 | [0, 104] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004648__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5800 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_004651__718 | 5 | 0.0 | 2.87222 | 5 | [0, 323] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004651__718.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5801 | NVIDIA-RTX-4090-4x | event_scheduler | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_004653__763 | 0 | 0.0 | 1.70594 | 0 | [0, 193] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_004653__763.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5802 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_231837__880 | 0 | 0.0 | 19.0658 | 0 | [107, 545] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_231837__880.json | 0.0 | missing | missing | missing | |
| 5803 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_231856__551 | 0 | 0.0 | 19.1587 | 0 | [1, 569] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_231856__551.json | 0.0 | missing | missing | missing | |
| 5804 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_231915__334 | 0 | 0.0 | 18.5171 | 0 | [1, 551] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_231915__334.json | 0.0 | missing | missing | missing | |
| 5805 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231225_175150__540 | 1 | 0.0 | 64.5593 | 1 | [119, 379] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_175150__540.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5806 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231225_175246__265 | 1 | 0.0 | 54.442 | 1 | [119, 317] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_175246__265.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5807 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_231759__698 | 0 | 0.0 | 19.4511 | 0 | [1, 576] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_231759__698.json | 0.0 | missing | missing | missing | |
| 5808 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_231818__233 | 1 | 0.0 | 19.0131 | 1 | [1, 564] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_231818__233.json | 60.0 | missing | missing | missing | |
| 5809 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_174957__875 | 5 | 0.0 | 90.3689 | 5 | [122, 530] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_174957__875.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5810 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_175045__697 | 0 | 0.0 | 47.2121 | 0 | [122, 269] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_175045__697.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5811 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_000329__576 | 5 | 0.0 | 96.8071 | 5 | [122, 580] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_000329__576.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5812 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_231707__766 | 0 | 0.0 | 9.01952 | 0 | [1, 277] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_231707__766.json | 0.0 | missing | missing | missing | |
| 5813 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_231723__534 | 0 | 0.0 | 15.3504 | 0 | [1, 458] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_231723__534.json | 0.0 | missing | missing | missing | |
| 5814 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_174723__299 | 5 | 0.0 | 62.123 | 5 | [163, 360] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_174723__299.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5815 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_174826__961 | 5 | 0.0 | 61.5501 | 5 | [163, 351] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_174826__961.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5816 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_000151__115 | 5 | 0.0 | 73.8858 | 5 | [163, 431] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_000151__115.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5817 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_231633__429 | 0 | 0.0 | 12.5158 | 0 | [1, 361] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231633__429.json | 0.0 | missing | missing | missing | |
| 5818 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_231642__263 | 1 | 0.0 | 9.43716 | 1 | [1, 276] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231642__263.json | 60.0 | missing | missing | missing | |
| 5819 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_174509__530 | 2 | 0.0 | 100.043 | 4 | [310, 400] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_174509__530.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5820 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_174621__951 | 1 | 0.0 | 71.1954 | 1 | [310, 389] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_174621__951.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5821 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_000036__389 | 1 | 0.0 | 90.759 | 1 | [310, 369] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_000036__389.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5822 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_232119__551 | 0 | 0.0 | 20.8047 | 0 | [1, 560] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_232119__551.json | 25.0 | missing | missing | missing | |
| 5823 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_232144__711 | 0 | 0.0 | 25.1982 | 0 | [1, 668] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_232144__711.json | 0.0 | missing | missing | missing | |
| 5824 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_175643__125 | 4 | 0.0 | 90.9193 | 5 | [451, 479] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_175643__125.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5825 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_175811__721 | 5 | 0.0 | 86.2169 | 5 | [451, 448] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_175811__721.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5826 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_000611__536 | 5 | 0.0 | 84.2061 | 5 | [451, 439] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_000611__536.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5827 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_232007__537 | 0 | 0.0 | 22.9554 | 0 | [1, 614] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_232007__537.json | 0.0 | missing | missing | missing | |
| 5828 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_232040__375 | 0 | 0.0 | 32.9607 | 0 | [1, 851] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_232040__375.json | 0.0 | missing | missing | missing | |
| 5829 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_175409__355 | 0 | 0.0 | 82.4326 | 0 | [449, 429] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_175409__355.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5830 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_175511__215 | 5 | 0.0 | 61.3079 | 5 | [449, 304] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_175511__215.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5831 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_000447__196 | 1 | 0.0 | 77.0941 | 1 | [449, 397] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_000447__196.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5832 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_001606__811 | 1 | 0.0 | 10.4637 | 1 | [121, 395] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_001606__811.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5833 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_112433__113 | 0 | 0.0 | 8.35668 | 0 | [121, 315] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_112433__113.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5834 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_112444__283 | 0 | 0.0 | 10.7785 | 0 | [121, 407] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_112444__283.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5835 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_112454__883 | 0 | 0.0 | 10.0575 | 0 | [121, 380] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_112454__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5836 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_001556__425 | 0 | 0.0 | 7.00391 | 0 | [158, 258] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_001556__425.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5837 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_112414__760 | 0 | 0.0 | 10.9344 | 0 | [158, 406] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_112414__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5838 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_112420__477 | 0 | 0.0 | 5.86859 | 0 | [158, 214] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_112420__477.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5839 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_112425__500 | 0 | 0.0 | 4.44336 | 0 | [158, 159] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_112425__500.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5840 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_001549__228 | 0 | 0.0 | 11.7197 | 0 | [276, 286] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001549__228.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5841 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_112344__373 | 0 | 0.0 | 13.6085 | 0 | [276, 360] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112344__373.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5842 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_112348__921 | 1 | 0.0 | 4.29583 | 1 | [276, 134] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112348__921.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5843 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_112403__475 | 0 | 0.0 | 14.9905 | 0 | [276, 526] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112403__475.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5844 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_001626__654 | 0 | 0.0 | 13.6669 | 0 | [410, 449] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001626__654.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5845 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112530__831 | 0 | 0.0 | 7.46289 | 0 | [410, 228] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112530__831.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5846 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112535__772 | 1 | 0.0 | 5.46074 | 5 | [410, 155] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112535__772.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 5847 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112543__552 | 1 | 0.0 | 7.38378 | 1 | [410, 225] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112543__552.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 5848 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001612__497 | 1 | 0.0 | 5.60151 | 1 | [407, 160] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_001612__497.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 5849 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_112506__506 | 1 | 0.0 | 12.3622 | 1 | [407, 403] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_112506__506.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 5850 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_112512__157 | 0 | 0.0 | 4.94238 | 0 | [407, 135] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_112512__157.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 5851 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_112522__286 | 0 | 0.0 | 10.2554 | 0 | [407, 329] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_112522__286.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 5852 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_110148__270 | 1 | 0.0 | 5.17526 | 1 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110148__270.json | 60.0 | missing | missing | missing | |
| 5853 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_110158__222 | 0 | 0.0 | 10.3648 | 1 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110158__222.json | 55.0 | missing | missing | missing | |
| 5854 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_110205__751 | 0 | 0.0 | 7.14969 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110205__751.json | 0.0 | missing | missing | missing | |
| 5855 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_110211__294 | 0 | 0.0 | 5.70084 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110211__294.json | 50.0 | missing | missing | missing | |
| 5856 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_110217__636 | 0 | 0.0 | 6.25683 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110217__636.json | 50.0 | missing | missing | missing | |
| 5857 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_110109__262 | 0 | 0.0 | 3.64741 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110109__262.json | 50.0 | missing | missing | missing | |
| 5858 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_110112__964 | 0 | 0.0 | 2.86538 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110112__964.json | 50.0 | missing | missing | missing | |
| 5859 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_110115__236 | 0 | 0.0 | 2.27605 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110115__236.json | 50.0 | missing | missing | missing | |
| 5860 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_110118__167 | 0 | 0.0 | 3.31234 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110118__167.json | 0.0 | missing | missing | missing | |
| 5861 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_110122__602 | 1 | 0.0 | 3.75496 | 1 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110122__602.json | 60.0 | missing | missing | missing | |
| 5862 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_110033__815 | 1 | 0.0 | 3.06083 | 1 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110033__815.json | 60.0 | missing | missing | missing | |
| 5863 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_110036__254 | 0 | 0.0 | 3.12198 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110036__254.json | 50.0 | missing | missing | missing | |
| 5864 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_110038__888 | 0 | 0.0 | 2.31296 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110038__888.json | 0.0 | missing | missing | missing | |
| 5865 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_110046__172 | 0 | 0.0 | 7.62171 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110046__172.json | 0.0 | missing | missing | missing | |
| 5866 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_110049__596 | 1 | 0.0 | 3.32869 | 1 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110049__596.json | 60.0 | missing | missing | missing | |
| 5867 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_110338__897 | 0 | 0.0 | 3.4185 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110338__897.json | 50.0 | missing | missing | missing | |
| 5868 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_110341__880 | 0 | 0.0 | 3.19675 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110341__880.json | 0.0 | missing | missing | missing | |
| 5869 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_110350__332 | 1 | 0.0 | 8.53299 | 1 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110350__332.json | 60.0 | missing | missing | missing | |
| 5870 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_110431__749 | 0 | 0.0 | 12.7881 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110431__749.json | 0.0 | missing | missing | missing | |
| 5871 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_113825__249 | 0 | 0.0 | 3.64779 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_113825__249.json | 0.0 | missing | missing | missing | |
| 5872 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_110252__764 | 0 | 0.0 | 3.84426 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110252__764.json | 50.0 | missing | missing | missing | |
| 5873 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_110255__317 | 0 | 0.0 | 3.38561 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110255__317.json | 50.0 | missing | missing | missing | |
| 5874 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_110259__554 | 1 | 0.0 | 3.66018 | 1 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110259__554.json | 60.0 | missing | missing | missing | |
| 5875 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_110304__509 | 0 | 0.0 | 4.50846 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110304__509.json | 0.0 | missing | missing | missing | |
| 5876 | Apple-MacBook-Pro-M1 | event_scheduler | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_110309__828 | 0 | 0.0 | 5.48936 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110309__828.json | 50.0 | missing | missing | missing | |
| 5877 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_225846__306 | 1 | 0.0 | 24.2107 | 1 | [0, 373] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_225846__306.json | 60.0 | missing | missing | missing | |
| 5878 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_225915__252 | 0 | 0.0 | 28.0465 | 0 | [0, 429] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_225915__252.json | 0.0 | missing | missing | missing | |
| 5879 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_225939__161 | 0 | 0.0 | 24.3154 | 0 | [0, 373] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_225939__161.json | 0.0 | missing | missing | missing | |
| 5880 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_230009__918 | 1 | 0.0 | 29.4914 | 1 | [0, 449] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_230009__918.json | 60.0 | missing | missing | missing | |
| 5881 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_230035__793 | 1 | 0.0 | 26.7104 | 1 | [0, 408] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_230035__793.json | 60.0 | missing | missing | missing | |
| 5882 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_225508__461 | 1 | 0.0 | 32.9915 | 1 | [0, 503] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_225508__461.json | 60.0 | missing | missing | missing | |
| 5883 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_225529__423 | 1 | 0.0 | 20.7982 | 1 | [0, 315] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_225529__423.json | 60.0 | missing | missing | missing | |
| 5884 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_225553__115 | 1 | 0.0 | 23.801 | 1 | [0, 361] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_225553__115.json | 60.0 | missing | missing | missing | |
| 5885 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_225615__944 | 0 | 0.0 | 22.329 | 0 | [0, 340] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_225615__944.json | 0.0 | missing | missing | missing | |
| 5886 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240223_225620__385 | 0 | 0.0 | 5.2747 | 0 | [0, 78] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_225620__385.json | 25.0 | missing | missing | missing | |
| 5887 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240223_225116__343 | 1 | 0.0 | 38.9354 | 1 | [0, 587] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_225116__343.json | 60.0 | missing | missing | missing | |
| 5888 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_225146__625 | 0 | 0.0 | 29.6806 | 0 | [0, 454] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_225146__625.json | 25.0 | missing | missing | missing | |
| 5889 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_225218__882 | 0 | 0.0 | 31.7728 | 0 | [0, 481] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_225218__882.json | 0.0 | missing | missing | missing | |
| 5890 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_225242__593 | 0 | 0.0 | 24.191 | 0 | [0, 363] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_225242__593.json | 25.0 | missing | missing | missing | |
| 5891 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240223_225316__176 | 1 | 0.0 | 33.7341 | 1 | [0, 510] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_225316__176.json | 60.0 | missing | missing | missing | |
| 5892 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240223_230845__675 | 0 | 0.0 | 34.7069 | 0 | [0, 524] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_230845__675.json | 25.0 | missing | missing | missing | |
| 5893 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_230928__360 | 1 | 0.0 | 43.2625 | 1 | [0, 645] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_230928__360.json | 60.0 | missing | missing | missing | |
| 5894 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_230955__917 | 1 | 0.0 | 26.5301 | 1 | [0, 399] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_230955__917.json | 60.0 | missing | missing | missing | |
| 5895 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_231036__335 | 1 | 0.0 | 40.6109 | 1 | [0, 606] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_231036__335.json | 60.0 | missing | missing | missing | |
| 5896 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_231104__398 | 1 | 0.0 | 27.8945 | 1 | [0, 423] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_231104__398.json | 60.0 | missing | missing | missing | |
| 5897 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_230343__377 | 1 | 0.0 | 23.2748 | 1 | [0, 351] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_230343__377.json | 60.0 | missing | missing | missing | |
| 5898 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_230413__151 | 0 | 0.0 | 30.2129 | 0 | [0, 458] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_230413__151.json | 0.0 | missing | missing | missing | |
| 5899 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_230443__929 | 1 | 0.0 | 30.3328 | 1 | [0, 459] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_230443__929.json | 60.0 | missing | missing | missing | |
| 5900 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_230514__331 | 0 | 0.0 | 30.7265 | 0 | [0, 463] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_230514__331.json | 0.0 | missing | missing | missing | |
| 5901 | Apple-MacBook-Pro-M1 | event_scheduler | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_230544__584 | 1 | 0.0 | 29.6423 | 1 | [0, 441] | 0.13.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_230544__584.json | 60.0 | missing | missing | missing | |
| 5902 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231213_202750__365 | 1 | 0.0007995 | 14.0465 | 1 | [108, 497] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_202750__365.json | 60.0 | missing | missing | missing | |
| 5903 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231225_191436__193 | 1 | 0.0007095 | 6.68155 | 1 | [108, 437] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_191436__193.json | 60.0 | missing | missing | missing | |
| 5904 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231225_191443__977 | 0 | 0.000666 | 6.7367 | 0 | [108, 408] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_191443__977.json | 50.0 | missing | missing | missing | |
| 5905 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo--optim | AsIs | 1SHOT | true | true | 5 | 20231215_193506__537 | 1 | 0.0 | 8.28616 | 1 | [108, 368] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_193506__537.json | 60.0 | 0.5 | missing | 0.5 | |
| 5906 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_202734__384 | 4 | 0.000681 | 10.0294 | 5 | [111, 417] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_202734__384.json | 95.0 | missing | missing | missing | |
| 5907 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_191425__703 | 5 | 0.000534 | 4.99116 | 5 | [111, 319] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_191425__703.json | 100.0 | missing | missing | missing | |
| 5908 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_191429__570 | 1 | 0.0004095 | 3.46295 | 1 | [111, 236] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_191429__570.json | 60.0 | missing | missing | missing | |
| 5909 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_194926__962 | 3 | 0.000522 | 5.33228 | 4 | [111, 311] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_194926__962.json | 85.0 | missing | missing | missing | |
| 5910 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_194931__661 | 4 | 0.000474 | 5.05581 | 5 | [111, 279] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_194931__661.json | 95.0 | missing | missing | missing | |
| 5911 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_193456__174 | 4 | 0.0 | 7.47956 | 5 | [111, 346] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_193456__174.json | 95.0 | 0.5 | missing | 0.5 | |
| 5912 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202723__764 | 0 | 0.000523 | 7.40204 | 0 | [146, 300] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_202723__764.json | 50.0 | missing | missing | missing | |
| 5913 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191416__866 | 1 | 0.000214 | 1.92608 | 1 | [146, 94] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_191416__866.json | 60.0 | missing | missing | missing | |
| 5914 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191419__494 | 1 | 0.0002155 | 1.83943 | 1 | [146, 95] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_191419__494.json | 60.0 | missing | missing | missing | |
| 5915 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194916__576 | 1 | 0.0004375 | 3.82523 | 1 | [146, 243] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_194916__576.json | 60.0 | missing | missing | missing | |
| 5916 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194920__288 | 1 | 0.000322 | 3.1241 | 1 | [146, 166] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_194920__288.json | 60.0 | missing | missing | missing | |
| 5917 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193448__773 | 4 | 0.0 | 4.15382 | 5 | [146, 199] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_193448__773.json | 95.0 | 0.5 | missing | 0.5 | |
| 5918 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_202716__631 | 0 | 0.000303 | 3.47413 | 0 | [255, 117] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202716__631.json | 0.0 | missing | missing | missing | |
| 5919 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_191412__221 | 0 | 0.000429 | 3.28096 | 0 | [255, 201] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191412__221.json | 0.0 | missing | missing | missing | |
| 5920 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_191414__889 | 0 | 0.000318 | 2.47163 | 0 | [255, 127] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191414__889.json | 0.0 | missing | missing | missing | |
| 5921 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_194910__430 | 0 | 0.0003465 | 2.81761 | 0 | [255, 146] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194910__430.json | 0.0 | missing | missing | missing | |
| 5922 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_194912__645 | 0 | 0.0002805 | 2.36476 | 0 | [255, 102] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194912__645.json | 0.0 | missing | missing | missing | |
| 5923 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_193444__385 | 0 | 0.0 | 3.17994 | 0 | [255, 134] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193444__385.json | 0.0 | 0.5 | missing | 0.5 |
| 5924 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_202759__637 | 0 | 0.0004485 | 4.94874 | 0 | [369, 176] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202759__637.json | 0.0 | missing | missing | missing | |
| 5925 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_191450__740 | 0 | 0.0003645 | 1.91479 | 0 | [369, 120] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191450__740.json | 0.0 | missing | missing | missing | |
| 5926 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_191452__641 | 0 | 0.000402 | 2.18437 | 0 | [369, 145] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191452__641.json | 0.0 | missing | missing | missing | |
| 5927 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_194940__755 | 0 | 0.000369 | 2.12413 | 0 | [369, 123] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194940__755.json | 0.0 | missing | missing | missing | |
| 5928 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_194944__831 | 0 | 0.0004215 | 3.9083 | 0 | [369, 158] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_194944__831.json | 0.0 | missing | missing | missing | |
| 5929 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_193512__944 | 0 | 0.0 | 3.07532 | 0 | [369, 124] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193512__944.json | 0.0 | 0.5 | missing | 0.5 |
| 5930 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_202754__118 | 0 | 0.000373 | 3.97447 | 0 | [368, 126] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_202754__118.json | 0.0 | missing | missing | missing | |
| 5931 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_191446__483 | 0 | 0.0004 | 2.4686 | 0 | [368, 144] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_191446__483.json | 0.0 | missing | missing | missing | |
| 5932 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_191448__819 | 0 | 0.0003925 | 2.57538 | 0 | [368, 139] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_191448__819.json | 0.0 | missing | missing | missing | |
| 5933 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_194937__101 | 0 | 0.00052 | 4.1436 | 0 | [368, 224] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_194937__101.json | 0.0 | missing | missing | missing | |
| 5934 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_194938__623 | 0 | 0.0003295 | 1.70926 | 0 | [368, 97] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_194938__623.json | 0.0 | missing | missing | missing | |
| 5935 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_193509__761 | 0 | 0.0 | 2.90463 | 0 | [368, 130] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_193509__761.json | 0.0 | 0.5 | missing | 0.5 |
| 5936 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200410__508 | 4 | 0.000528 | 2.3682 | 4 | [111, 315] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200410__508.json | 90.0 | missing | missing | missing | |
| 5937 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200412__884 | 0 | 0.0003465 | 1.63135 | 0 | [111, 194] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200412__884.json | 50.0 | missing | missing | missing | |
| 5938 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200414__652 | 3 | 0.0005055 | 2.10788 | 4 | [111, 300] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200414__652.json | 85.0 | missing | missing | missing | |
| 5939 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200416__563 | 3 | 0.000447 | 1.87659 | 4 | [111, 261] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200416__563.json | 85.0 | missing | missing | missing | |
| 5940 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200418__324 | 0 | 0.0004785 | 2.16801 | 0 | [111, 282] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200418__324.json | 50.0 | missing | missing | missing | |
| 5941 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200403__393 | 1 | 0.0002005 | 1.02281 | 1 | [146, 85] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200403__393.json | 60.0 | missing | missing | missing | |
| 5942 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200405__436 | 1 | 0.0002215 | 1.06408 | 1 | [146, 99] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200405__436.json | 60.0 | missing | missing | missing | |
| 5943 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200406__339 | 1 | 0.000223 | 0.943731 | 1 | [146, 100] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200406__339.json | 60.0 | missing | missing | missing | |
| 5944 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200407__879 | 0 | 0.000223 | 0.906326 | 0 | [146, 100] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200407__879.json | 50.0 | missing | missing | missing | |
| 5945 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200408__124 | 4 | 0.000229 | 0.860462 | 5 | [146, 104] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200408__124.json | 95.0 | missing | missing | missing | |
| 5946 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200355__102 | 5 | 0.0004065 | 1.40112 | 5 | [255, 186] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200355__102.json | 100.0 | missing | missing | missing | |
| 5947 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200357__689 | 4 | 0.000447 | 1.78366 | 5 | [255, 213] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200357__689.json | 95.0 | missing | missing | missing | |
| 5948 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200358__503 | 5 | 0.0003645 | 1.31641 | 5 | [255, 158] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200358__503.json | 100.0 | missing | missing | missing | |
| 5949 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200400__804 | 5 | 0.000573 | 2.08956 | 5 | [255, 297] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200400__804.json | 100.0 | missing | missing | missing | |
| 5950 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200402__865 | 0 | 0.000429 | 1.74055 | 0 | [255, 201] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200402__865.json | 25.0 | missing | missing | missing | |
| 5951 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200427__421 | 0 | 0.000399 | 1.31771 | 0 | [369, 143] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200427__421.json | 0.0 | missing | missing | missing | |
| 5952 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200427__698 | 0 | 0.0002745 | 0.619528 | 0 | [369, 60] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200427__698.json | 0.0 | missing | missing | missing | |
| 5953 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200428__328 | 0 | 0.0003975 | 1.20299 | 0 | [369, 142] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200428__328.json | 0.0 | missing | missing | missing | |
| 5954 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200429__672 | 0 | 0.000246 | 0.724338 | 0 | [369, 41] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200429__672.json | 0.0 | missing | missing | missing | |
| 5955 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200430__863 | 0 | 0.0003225 | 1.01227 | 0 | [369, 92] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200430__863.json | 0.0 | missing | missing | missing | |
| 5956 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_200420__213 | 0 | 0.0003655 | 0.983389 | 0 | [368, 121] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200420__213.json | 0.0 | missing | missing | missing | |
| 5957 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_200421__591 | 0 | 0.000361 | 1.19434 | 0 | [368, 118] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200421__591.json | 0.0 | missing | missing | missing | |
| 5958 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_200422__534 | 0 | 0.000421 | 1.42022 | 0 | [368, 158] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200422__534.json | 0.0 | missing | missing | missing | |
| 5959 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_200424__824 | 0 | 0.0004255 | 1.45554 | 0 | [368, 161] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200424__824.json | 0.0 | missing | missing | missing | |
| 5960 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200425__273 | 4 | 0.000427 | 1.47479 | 4 | [368, 162] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200425__273.json | 90.0 | missing | missing | missing | |
| 5961 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231213_202819__265 | 3 | 0.000662 | 5.17523 | 4 | [108, 277] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_202819__265.json | 85.0 | missing | missing | missing | |
| 5962 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231225_191510__275 | 1 | 0.000592 | 2.77423 | 1 | [108, 242] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_191510__275.json | 60.0 | missing | missing | missing | |
| 5963 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231225_191514__793 | 1 | 0.000494 | 2.63116 | 1 | [108, 193] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_191514__793.json | 60.0 | missing | missing | missing | |
| 5964 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | true | true | 5 | 20231215_193528__775 | 0 | 0.0 | 3.86361 | 0 | [108, 205] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_193528__775.json | 50.0 | 0.9 | missing | 0.1 |
| 5965 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231213_202814__311 | 3 | 0.000725 | 4.72546 | 4 | [111, 307] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_202814__311.json | 85.0 | missing | missing | missing | |
| 5966 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_191504__789 | 5 | 0.000661 | 3.40424 | 5 | [111, 275] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_191504__789.json | 100.0 | missing | missing | missing | |
| 5967 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_191507__840 | 4 | 0.000751 | 3.66202 | 4 | [111, 320] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_191507__840.json | 90.0 | missing | missing | missing | |
| 5968 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_194959__662 | 0 | 0.000659 | 4.08945 | 0 | [111, 274] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_194959__662.json | 50.0 | missing | missing | missing | |
| 5969 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_195003__959 | 0 | 0.000687 | 4.61199 | 0 | [111, 288] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_195003__959.json | 50.0 | missing | missing | missing | |
| 5970 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_193524__299 | 0 | 0.0 | 4.59509 | 0 | [111, 213] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_193524__299.json | 50.0 | 0.9 | missing | 0.1 |
| 5971 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202809__751 | 0 | 0.000504 | 5.03026 | 0 | [146, 179] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_202809__751.json | 50.0 | missing | missing | missing | |
| 5972 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191458__727 | 1 | 0.000336 | 1.5056 | 1 | [146, 95] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_191458__727.json | 60.0 | missing | missing | missing | |
| 5973 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191500__241 | 1 | 0.000326 | 1.32989 | 1 | [146, 90] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_191500__241.json | 60.0 | missing | missing | missing | |
| 5974 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194952__624 | 4 | 0.000316 | 1.80812 | 5 | [146, 85] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_194952__624.json | 95.0 | missing | missing | missing | |
| 5975 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_194954__793 | 1 | 0.000418 | 2.25695 | 1 | [146, 136] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_194954__793.json | 60.0 | missing | missing | missing | |
| 5976 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193519__720 | 5 | 0.0 | 2.563 | 5 | [146, 102] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_193519__720.json | 100.0 | 0.9 | missing | 0.1 |
| 5977 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202803__598 | 4 | 0.000819 | 4.36533 | 5 | [255, 282] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202803__598.json | 95.0 | missing | missing | missing | |
| 5978 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_191454__577 | 0 | 0.000441 | 1.44519 | 0 | [255, 93] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191454__577.json | 0.0 | missing | missing | missing | |
| 5979 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191456__617 | 4 | 0.000591 | 2.25327 | 5 | [255, 168] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191456__617.json | 95.0 | missing | missing | missing | |
| 5980 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_194948__628 | 4 | 0.000631 | 3.28283 | 5 | [255, 188] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194948__628.json | 95.0 | missing | missing | missing | |
| 5981 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_194950__344 | 0 | 0.000529 | 2.2173 | 0 | [255, 137] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_194950__344.json | 25.0 | missing | missing | missing | |
| 5982 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_193516__276 | 4 | 0.0 | 3.66349 | 5 | [255, 204] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193516__276.json | 95.0 | 0.9 | missing | 0.1 |
| 5983 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_202826__723 | 0 | 0.000491 | 2.05683 | 0 | [369, 61] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202826__723.json | 0.0 | missing | missing | missing | |
| 5984 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_191519__350 | 0 | 0.000445 | 1.01647 | 0 | [369, 38] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191519__350.json | 0.0 | missing | missing | missing | |
| 5985 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_191520__822 | 0 | 0.000611 | 1.73213 | 0 | [369, 121] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191520__822.json | 0.0 | missing | missing | missing | |
| 5986 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_195010__194 | 0 | 0.000639 | 2.96164 | 0 | [369, 135] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_195010__194.json | 0.0 | missing | missing | missing | |
| 5987 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_195012__455 | 0 | 0.000471 | 1.15723 | 0 | [369, 51] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_195012__455.json | 0.0 | missing | missing | missing | |
| 5988 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_193535__346 | 0 | 0.0 | 2.57133 | 0 | [369, 94] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193535__346.json | 0.0 | 0.9 | missing | 0.1 |
| 5989 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202823__245 | 1 | 0.000584 | 3.67932 | 1 | [368, 108] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_202823__245.json | 60.0 | missing | missing | missing | |
| 5990 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_191517__379 | 0 | 0.000582 | 1.71293 | 0 | [368, 107] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_191517__379.json | 0.0 | missing | missing | missing | |
| 5991 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_191518__635 | 0 | 0.000472 | 0.975445 | 0 | [368, 52] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_191518__635.json | 0.0 | missing | missing | missing | |
| 5992 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_195005__533 | 0 | 0.000586 | 1.81 | 0 | [368, 109] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_195005__533.json | 0.0 | missing | missing | missing | |
| 5993 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_195008__652 | 0 | 0.000622 | 2.15446 | 0 | [368, 127] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_195008__652.json | 0.0 | missing | missing | missing | |
| 5994 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_193532__970 | 0 | 0.0 | 3.81 | 0 | [368, 120] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_193532__970.json | 0.0 | 0.9 | missing | 0.1 |
| 5995 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_090538__401 | 5 | 0.01668 | 41.5913 | 5 | [111, 519] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_090538__401.json | 100.0 | missing | missing | missing | |
| 5996 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_090611__461 | 5 | 0.01539 | 33.4138 | 5 | [111, 476] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_090611__461.json | 100.0 | missing | missing | missing | |
| 5997 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_090648__560 | 4 | 0.01605 | 36.3685 | 4 | [111, 498] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_090648__560.json | 90.0 | missing | missing | missing | |
| 5998 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_090720__494 | 5 | 0.01668 | 32.2745 | 5 | [111, 519] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_090720__494.json | 100.0 | missing | missing | missing | |
| 5999 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_090809__507 | 5 | 0.01947 | 48.1717 | 5 | [111, 612] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_090809__507.json | 100.0 | missing | missing | missing | |
| 6000 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_085946__587 | 4 | 0.00638 | 11.4203 | 4 | [146, 164] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_085946__587.json | 90.0 | missing | missing | missing | |
| 6001 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_085957__726 | 5 | 0.00575 | 11.0742 | 5 | [146, 143] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_085957__726.json | 100.0 | missing | missing | missing | |
| 6002 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_090037__505 | 5 | 0.01037 | 39.5008 | 5 | [146, 297] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_090037__505.json | 100.0 | missing | missing | missing | |
| 6003 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_090050__646 | 5 | 0.00662 | 13.4769 | 5 | [146, 172] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_090050__646.json | 100.0 | missing | missing | missing | |
| 6004 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_090106__577 | 4 | 0.00614 | 15.8118 | 4 | [146, 156] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_090106__577.json | 90.0 | missing | missing | missing | |
| 6005 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_085601__979 | 5 | 0.0168 | 34.4829 | 5 | [255, 475] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_085601__979.json | 100.0 | missing | missing | missing | |
| 6006 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_085641__751 | 5 | 0.01518 | 39.5327 | 5 | [255, 421] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_085641__751.json | 100.0 | missing | missing | missing | |
| 6007 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_085722__960 | 5 | 0.01794 | 40.3911 | 5 | [255, 513] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_085722__960.json | 100.0 | missing | missing | missing | |
| 6008 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_085750__975 | 5 | 0.0132 | 28.554 | 5 | [255, 355] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_085750__975.json | 100.0 | missing | missing | missing | |
| 6009 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_085821__146 | 5 | 0.0159 | 30.0509 | 5 | [255, 445] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_085821__146.json | 100.0 | missing | missing | missing | |
| 6010 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_091936__556 | 5 | 0.02352 | 61.2649 | 5 | [369, 661] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_091936__556.json | 100.0 | missing | missing | missing | |
| 6011 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_092002__262 | 5 | 0.01287 | 25.4196 | 5 | [369, 306] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_092002__262.json | 100.0 | missing | missing | missing | |
| 6012 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_092023__592 | 4 | 0.01212 | 21.5073 | 4 | [369, 281] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_092023__592.json | 90.0 | missing | missing | missing | |
| 6013 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_092103__340 | 4 | 0.02004 | 40.0478 | 4 | [369, 545] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_092103__340.json | 90.0 | missing | missing | missing | |
| 6014 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_092137__144 | 0 | 0.01788 | 33.3993 | 0 | [369, 473] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_092137__144.json | 50.0 | missing | missing | missing | |
| 6015 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_091215__451 | 5 | 0.01634 | 33.8482 | 5 | [368, 422] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_091215__451.json | 100.0 | missing | missing | missing | |
| 6016 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_091326__127 | 4 | 0.01649 | 71.5031 | 4 | [368, 427] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_091326__127.json | 90.0 | missing | missing | missing | |
| 6017 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_091412__456 | 4 | 0.0194 | 45.4711 | 4 | [368, 524] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_091412__456.json | 90.0 | missing | missing | missing | |
| 6018 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_091509__448 | 4 | 0.02051 | 57.0948 | 4 | [368, 561] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_091509__448.json | 90.0 | missing | missing | missing | |
| 6019 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_091613__472 | 4 | 0.02054 | 63.4214 | 4 | [368, 562] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_091613__472.json | 90.0 | missing | missing | missing | |
| 6020 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231213_203045__698 | 0 | 0.01707 | 46.7629 | 0 | [108, 533] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_203045__698.json | 0.0 | missing | missing | missing | |
| 6021 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_191727__359 | 0 | 0.01449 | 16.5972 | 0 | [108, 447] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_191727__359.json | 0.0 | missing | missing | missing | |
| 6022 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231225_191749__643 | 5 | 0.01272 | 22.4796 | 5 | [108, 388] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_191749__643.json | 100.0 | missing | missing | missing | |
| 6023 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 5 | 20231215_193726__181 | 0 | 0.0 | 38.0754 | 0 | [108, 440] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_193726__181.json | 0.0 | 0.1 | missing | 0.9 |
| 6024 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_202957__263 | 5 | 0.01512 | 33.2801 | 5 | [111, 467] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_202957__263.json | 100.0 | missing | missing | missing | |
| 6025 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_191650__708 | 4 | 0.01548 | 19.863 | 4 | [111, 479] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_191650__708.json | 90.0 | missing | missing | missing | |
| 6026 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_191710__648 | 4 | 0.01719 | 19.0097 | 4 | [111, 536] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_191710__648.json | 90.0 | missing | missing | missing | |
| 6027 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_195141__948 | 4 | 0.01551 | 31.2938 | 4 | [111, 480] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_195141__948.json | 90.0 | missing | missing | missing | |
| 6028 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | InJulia | 1SHOT | true | false | 5 | 20231227_195220__798 | 0 | 0.01821 | 37.6828 | 0 | [111, 570] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_195220__798.json | 25.0 | missing | missing | missing | |
| 6029 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_193647__129 | 4 | 0.0 | 46.197 | 4 | [111, 537] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_193647__129.json | 90.0 | 0.1 | missing | 0.9 |
| 6030 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202923__464 | 4 | 0.00569 | 13.0066 | 4 | [146, 141] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_202923__464.json | 90.0 | missing | missing | missing | |
| 6031 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191620__266 | 1 | 0.00869 | 11.5586 | 1 | [146, 241] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_191620__266.json | 60.0 | missing | missing | missing | |
| 6032 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191630__676 | 1 | 0.00722 | 9.251 | 1 | [146, 192] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_191630__676.json | 60.0 | missing | missing | missing | |
| 6033 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_195054__175 | 0 | 0.00731 | 11.5197 | 0 | [146, 195] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_195054__175.json | 50.0 | missing | missing | missing | |
| 6034 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_195110__424 | 5 | 0.00677 | 15.6628 | 5 | [146, 177] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_195110__424.json | 100.0 | missing | missing | missing | |
| 6035 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193600__635 | 5 | 0.0 | 14.1594 | 5 | [146, 201] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_193600__635.json | 100.0 | 0.1 | missing | 0.9 |
| 6036 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202910__441 | 5 | 0.01266 | 44.0673 | 5 | [255, 337] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202910__441.json | 100.0 | missing | missing | missing | |
| 6037 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191542__721 | 5 | 0.02013 | 21.3288 | 5 | [255, 586] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191542__721.json | 100.0 | missing | missing | missing | |
| 6038 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191608__922 | 4 | 0.02055 | 24.7674 | 4 | [255, 600] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191608__922.json | 90.0 | missing | missing | missing | |
| 6039 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_195036__600 | 0 | 0.01182 | 24.6259 | 0 | [255, 309] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_195036__600.json | 25.0 | missing | missing | missing | |
| 6040 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_195042__464 | 0 | 0.00432 | 5.90107 | 0 | [255, 59] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_195042__464.json | 0.0 | missing | missing | missing | |
| 6041 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_193546__995 | 0 | 0.0 | 11.4798 | 0 | [255, 159] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193546__995.json | 0.0 | 0.1 | missing | 0.9 |
| 6042 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_203153__676 | 0 | 0.01833 | 27.7441 | 0 | [369, 488] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203153__676.json | 0.0 | missing | missing | missing | |
| 6043 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_191853__536 | 1 | 0.01839 | 17.5087 | 1 | [369, 490] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191853__536.json | 60.0 | missing | missing | missing | |
| 6044 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_191919__960 | 5 | 0.01971 | 25.6875 | 5 | [369, 534] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_191919__960.json | 100.0 | missing | missing | missing | |
| 6045 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_195358__593 | 0 | 0.01905 | 43.3121 | 0 | [369, 512] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_195358__593.json | 25.0 | missing | missing | missing | |
| 6046 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_195436__186 | 5 | 0.01614 | 38.1764 | 5 | [369, 415] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_195436__186.json | 100.0 | missing | missing | missing | |
| 6047 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_193900__258 | 0 | 0.0 | 60.6001 | 0 | [369, 476] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193900__258.json | 50.0 | 0.1 | missing | 0.9 |
| 6048 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_203125__775 | 0 | 0.02054 | 40.2743 | 0 | [368, 562] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_203125__775.json | 50.0 | missing | missing | missing | |
| 6049 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_191816__110 | 5 | 0.02096 | 26.1196 | 5 | [368, 576] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_191816__110.json | 100.0 | missing | missing | missing | |
| 6050 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_191835__906 | 0 | 0.01691 | 17.5912 | 0 | [368, 441] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_191835__906.json | 50.0 | missing | missing | missing | |
| 6051 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_195249__441 | 1 | 0.01394 | 29.3008 | 1 | [368, 342] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_195249__441.json | 60.0 | missing | missing | missing | |
| 6052 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_195314__423 | 0 | 0.01322 | 24.3107 | 0 | [368, 318] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_195314__423.json | 50.0 | missing | missing | missing | |
| 6053 | Apple-MacBook-Pro-M1 | event_scheduler | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_193759__674 | 0 | 0.0 | 33.1212 | 0 | [368, 448] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_193759__674.json | 50.0 | 0.1 | missing | 0.9 |
| 6054 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | AsIs | 1SHOT | false | false | 5 | 20231214_003912__772 | 0 | 0.0 | 14.9002 | 0 | [107, 428] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__AsIs__1SHOT__20231214_003912__772.json | 0.0 | missing | missing | missing | |
| 6055 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_165158__350 | 0 | 0.0 | 14.9759 | 0 | [107, 432] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__AsIs__1SHOT__20231225_165158__350.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6056 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_165213__395 | 0 | 0.0 | 15.536 | 0 | [1, 470] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__AsIs__1SHOT__20231225_165213__395.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6057 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | InJulia | 1SHOT | true | true | 5 | 20231214_003857__675 | 1 | 0.0 | 15.2401 | 1 | [124, 436] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__InJulia__1SHOT__20231214_003857__675.json | 60.0 | missing | missing | missing | |
| 6058 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_165128__210 | 0 | 0.0 | 19.0368 | 0 | [124, 543] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__InJulia__1SHOT__20231225_165128__210.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6059 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_165142__736 | 1 | 0.0 | 14.7271 | 1 | [1, 447] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__InJulia__1SHOT__20231225_165142__736.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6060 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | InJulia | 1SHOT | false | false | 5 | 20231226_234205__117 | 0 | 0.0 | 13.2836 | 0 | [124, 386] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__InJulia__1SHOT__20231226_234205__117.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6061 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_003841__348 | 0 | 0.0 | 4.40315 | 0 | [153, 104] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_003841__348.json | 0.0 | missing | missing | missing | |
| 6062 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_165056__536 | 0 | 0.0 | 13.8085 | 0 | [153, 385] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_165056__536.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6063 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_165108__794 | 1 | 0.0 | 11.955 | 1 | [1, 363] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_165108__794.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6064 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234152__841 | 1 | 0.0 | 16.9356 | 1 | [153, 478] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_234152__841.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6065 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_003837__137 | 0 | 0.0 | 16.0641 | 0 | [300, 387] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003837__137.json | 0.0 | missing | missing | missing | |
| 6066 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_165019__283 | 0 | 0.0 | 22.6332 | 0 | [318, 433] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165019__283.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6067 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_165043__418 | 0 | 0.0 | 23.8932 | 0 | [1, 658] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165043__418.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6068 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_234135__616 | 0 | 0.0 | 14.9713 | 0 | [318, 233] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234135__616.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6069 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_003958__927 | 0 | 0.0 | 15.1994 | 0 | [11, 412] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003958__927.json | 0.0 | missing | missing | missing | |
| 6070 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_165323__755 | 0 | 0.0 | 22.2524 | 0 | [11, 592] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165323__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6071 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_165357__114 | 0 | 0.0 | 34.1486 | 0 | [1, 879] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165357__114.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6072 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_234236__775 | 0 | 0.0 | 18.4574 | 0 | [11, 502] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234236__775.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6073 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_003943__938 | 0 | 0.0 | 30.8686 | 0 | [424, 705] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_003943__938.json | 0.0 | missing | missing | missing | |
| 6074 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_165238__201 | 1 | 0.0 | 24.5327 | 1 | [424, 553] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_165238__201.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6075 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_165300__606 | 0 | 0.0 | 22.4514 | 0 | [1, 602] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_165300__606.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6076 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_234218__377 | 0 | 0.0 | 12.3287 | 0 | [424, 237] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_234218__377.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6077 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | AsIs | 1SHOT | false | false | 5 | 20231214_005048__236 | 0 | 0.0 | 15.8354 | 0 | [107, 454] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__AsIs__1SHOT__20231214_005048__236.json | 0.0 | missing | missing | missing | |
| 6078 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_172312__337 | 0 | 0.0 | 10.8018 | 0 | [121, 348] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__AsIs__1SHOT__20231225_172312__337.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6079 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_172321__145 | 0 | 0.0 | 8.81566 | 0 | [121, 283] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__AsIs__1SHOT__20231225_172321__145.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6080 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | InJulia | 1SHOT | true | true | 5 | 20231214_005032__550 | 1 | 0.0 | 16.6628 | 1 | [124, 475] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__InJulia__1SHOT__20231214_005032__550.json | 60.0 | missing | missing | missing | |
| 6081 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_172250__376 | 0 | 0.0 | 16.52 | 0 | [124, 535] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__InJulia__1SHOT__20231225_172250__376.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6082 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_172301__325 | 1 | 0.0 | 10.0953 | 1 | [124, 325] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__InJulia__1SHOT__20231225_172301__325.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6083 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | InJulia | 1SHOT | true | true | 5 | 20231226_235303__252 | 1 | 0.0 | 11.746 | 1 | [124, 377] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__InJulia__1SHOT__20231226_235303__252.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6084 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_005015__540 | 0 | 0.0 | 10.9916 | 0 | [153, 303] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_005015__540.json | 0.0 | missing | missing | missing | |
| 6085 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_172221__340 | 0 | 0.0 | 10.6509 | 0 | [163, 333] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_172221__340.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6086 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_172233__463 | 2 | 0.0 | 12.2684 | 4 | [163, 387] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_172233__463.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6087 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_235251__342 | 0 | 0.0 | 14.0741 | 0 | [163, 442] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_235251__342.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6088 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_005004__757 | 0 | 0.0 | 29.1408 | 0 | [300, 723] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005004__757.json | 0.0 | missing | missing | missing | |
| 6089 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_172156__868 | 0 | 0.0 | 19.6631 | 0 | [310, 402] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172156__868.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6090 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_172210__362 | 0 | 0.0 | 13.0568 | 0 | [310, 384] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172210__362.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6091 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_235237__864 | 1 | 0.0 | 17.3382 | 1 | [310, 335] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235237__864.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6092 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_005124__786 | 0 | 0.0 | 12.8878 | 0 | [11, 352] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005124__786.json | 0.0 | missing | missing | missing | |
| 6093 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_172407__211 | 0 | 0.0 | 11.9519 | 0 | [427, 324] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172407__211.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6094 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_172425__934 | 0 | 0.0 | 18.2161 | 0 | [427, 515] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172425__934.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6095 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_235332__786 | 3 | 0.0 | 15.0047 | 5 | [427, 418] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235332__786.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6096 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_005111__137 | 1 | 0.0 | 23.2406 | 1 | [424, 518] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_005111__137.json | 60.0 | missing | missing | missing | |
| 6097 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_172340__967 | 1 | 0.0 | 18.6923 | 1 | [424, 534] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_172340__967.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6098 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_172354__191 | 0 | 0.0 | 14.2659 | 0 | [424, 398] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_172354__191.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6099 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_235317__579 | 0 | 0.0 | 13.4932 | 0 | [424, 371] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_235317__579.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6100 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180522__266 | 0 | 0.0 | 14.3524 | 0 | [124, 272] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180522__266.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6101 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180539__517 | 0 | 0.0 | 16.9282 | 0 | [124, 323] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180539__517.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6102 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180556__892 | 1 | 0.0 | 17.2363 | 1 | [124, 329] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180556__892.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6103 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180426__479 | 1 | 0.0 | 20.0131 | 1 | [163, 377] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180426__479.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6104 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180443__579 | 1 | 0.0 | 17.2089 | 1 | [163, 322] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180443__579.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6105 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180507__664 | 1 | 0.0 | 23.1992 | 1 | [163, 432] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180507__664.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6106 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180322__815 | 1 | 0.0 | 24.1334 | 1 | [310, 437] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180322__815.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6107 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180339__589 | 4 | 0.0 | 16.5766 | 4 | [310, 292] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180339__589.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6108 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_180406__353 | 0 | 0.0 | 25.3634 | 0 | [310, 460] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180406__353.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6109 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_180719__986 | 0 | 0.0 | 27.1019 | 0 | [427, 433] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180719__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6110 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_180743__786 | 0 | 0.0 | 23.9632 | 0 | [427, 407] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180743__786.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6111 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_180801__977 | 0 | 0.0 | 17.861 | 0 | [427, 298] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180801__977.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6112 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_180608__770 | 1 | 0.0 | 11.6416 | 1 | [424, 179] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180608__770.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6113 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_180634__509 | 1 | 0.0 | 25.177 | 1 | [424, 436] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180634__509.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6114 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_180652__563 | 1 | 0.0 | 16.6992 | 1 | [424, 263] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180652__563.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6115 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231213_201221__326 | 0 | 0.00342786 | 8.60846 | 0 | [119, 384] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__AsIs__1SHOT__20231213_201221__326.json | 0.0 | missing | missing | missing | |
| 6116 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | AsIs | 1SHOT | true | false | 5 | 20231225_192349__892 | 0 | 0.00467372 | 12.0975 | 0 | [119, 538] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__AsIs__1SHOT__20231225_192349__892.json | 25.0 | missing | missing | missing | |
| 6117 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231225_192358__421 | 1 | 0.00357348 | 8.95747 | 1 | [119, 402] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__AsIs__1SHOT__20231225_192358__421.json | 60.0 | missing | missing | missing | |
| 6118 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium--optim | AsIs | 1SHOT | true | false | 5 | 20231215_194114__181 | 0 | 0.0 | 26.8145 | 0 | [119, 569] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__AsIs__1SHOT__20231215_194114__181.json | 25.0 | 0.9 | missing | 0.3 | |
| 6119 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231213_201211__875 | 1 | 0.00463328 | 11.8663 | 1 | [122, 532] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__InJulia__1SHOT__20231213_201211__875.json | 60.0 | missing | missing | missing | |
| 6120 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | InJulia | 1SHOT | false | false | 5 | 20231225_192323__768 | 0 | 0.00431777 | 11.0168 | 0 | [122, 493] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__InJulia__1SHOT__20231225_192323__768.json | 0.0 | missing | missing | missing | |
| 6121 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_192336__917 | 0 | 0.00501351 | 13.1434 | 0 | [122, 579] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__InJulia__1SHOT__20231225_192336__917.json | 50.0 | missing | missing | missing | |
| 6122 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_195823__503 | 1 | 0.00410743 | 12.6743 | 1 | [122, 467] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__InJulia__1SHOT__20231227_195823__503.json | 60.0 | missing | missing | missing | |
| 6123 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_195847__289 | 5 | 0.00398608 | 23.1239 | 5 | [122, 452] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__InJulia__1SHOT__20231227_195847__289.json | 100.0 | missing | missing | missing | |
| 6124 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium--optim | InJulia | 1SHOT | true | true | 5 | 20231215_194046__209 | 0 | 0.0 | 14.8562 | 0 | [122, 587] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__InJulia__1SHOT__20231215_194046__209.json | 50.0 | 0.9 | missing | 0.3 | |
| 6125 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_201159__385 | 3 | 0.00246529 | 5.69265 | 5 | [161, 251] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_201159__385.json | 90.0 | missing | missing | missing | |
| 6126 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192305__316 | 1 | 0.00301541 | 13.1357 | 1 | [161, 319] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_192305__316.json | 60.0 | missing | missing | missing | |
| 6127 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_192312__513 | 0 | 0.00265136 | 6.81044 | 0 | [161, 274] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_192312__513.json | 0.0 | missing | missing | missing | |
| 6128 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_195803__578 | 1 | 0.00409138 | 17.8041 | 1 | [161, 452] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_195803__578.json | 60.0 | missing | missing | missing | |
| 6129 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_195810__313 | 0 | 0.00277271 | 6.5917 | 0 | [161, 289] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_195810__313.json | 0.0 | missing | missing | missing | |
| 6130 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_194031__648 | 1 | 0.0 | 5.63052 | 1 | [161, 250] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_194031__648.json | 60.0 | 0.9 | missing | 0.3 | |
| 6131 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_201153__405 | 0 | 0.00467435 | 20.3236 | 0 | [308, 475] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_201153__405.json | 0.0 | missing | missing | missing | |
| 6132 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_192233__724 | 1 | 0.00561279 | 18.0081 | 1 | [308, 591] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192233__724.json | 60.0 | missing | missing | missing | |
| 6133 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_192251__183 | 1 | 0.00549953 | 18.0042 | 1 | [308, 577] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192251__183.json | 60.0 | missing | missing | missing | |
| 6134 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_195714__172 | 0 | 0.00598493 | 22.1484 | 0 | [308, 637] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_195714__172.json | 0.0 | missing | missing | missing | |
| 6135 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_195745__578 | 0 | 0.00920475 | 30.8496 | 0 | [308, 1035] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_195745__578.json | 0.0 | missing | missing | missing | |
| 6136 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_194024__986 | 1 | 0.0 | 13.653 | 1 | [308, 606] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_194024__986.json | 60.0 | 0.9 | missing | 0.3 | |
| 6137 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_201249__139 | 0 | 0.00552958 | 12.4298 | 0 | [424, 542] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_201249__139.json | 50.0 | missing | missing | missing | |
| 6138 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_192449__611 | 0 | 0.00613633 | 18.8159 | 0 | [424, 617] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192449__611.json | 50.0 | missing | missing | missing | |
| 6139 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_192508__568 | 0 | 0.00573992 | 18.4144 | 0 | [424, 568] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192508__568.json | 50.0 | missing | missing | missing | |
| 6140 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_200016__960 | 0 | 0.00557003 | 12.4782 | 0 | [424, 547] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200016__960.json | 0.0 | missing | missing | missing | |
| 6141 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_200030__658 | 0 | 0.00625768 | 14.2148 | 0 | [424, 632] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200030__658.json | 0.0 | missing | missing | missing | |
| 6142 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_194207__435 | 0 | 0.0 | 27.9499 | 0 | [424, 563] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_194207__435.json | 50.0 | 0.9 | missing | 0.3 | |
| 6143 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_201236__244 | 0 | 0.0063952 | 14.9419 | 0 | [421, 650] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_201236__244.json | 50.0 | missing | missing | missing | |
| 6144 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192412__420 | 0 | 0.00593407 | 13.8231 | 0 | [421, 593] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_192412__420.json | 50.0 | missing | missing | missing | |
| 6145 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_192430__690 | 0 | 0.00724465 | 17.5978 | 0 | [421, 755] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_192430__690.json | 0.0 | missing | missing | missing | |
| 6146 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_195920__208 | 0 | 0.0059907 | 32.2186 | 0 | [421, 600] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_195920__208.json | 0.0 | missing | missing | missing | |
| 6147 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200003__791 | 0 | 0.00538395 | 42.2822 | 0 | [421, 525] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_200003__791.json | 50.0 | missing | missing | missing | |
| 6148 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_194138__498 | 0 | 0.0 | 24.0121 | 0 | [421, 492] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_194138__498.json | 50.0 | 0.9 | missing | 0.3 | |
| 6149 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | AsIs | 1SHOT | true | true | 5 | 20231213_203325__211 | 5 | 0.00131601 | 8.53361 | 5 | [118, 639] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__AsIs__1SHOT__20231213_203325__211.json | 100.0 | missing | missing | missing | |
| 6150 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | AsIs | 1SHOT | true | true | 5 | 20231225_192115__186 | 5 | 0.00135287 | 9.06739 | 5 | [118, 658] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__AsIs__1SHOT__20231225_192115__186.json | 100.0 | missing | missing | missing | |
| 6151 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | AsIs | 1SHOT | true | true | 5 | 20231225_192124__547 | 5 | 0.00118409 | 7.72981 | 5 | [118, 571] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__AsIs__1SHOT__20231225_192124__547.json | 100.0 | missing | missing | missing | |
| 6152 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small--optim | AsIs | 1SHOT | true | true | 5 | 20231215_193950__595 | 5 | 0.0 | 7.95256 | 5 | [118, 604] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__AsIs__1SHOT__20231215_193950__595.json | 100.0 | 0.9 | missing | 0.3 | |
| 6153 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231213_203315__765 | 1 | 0.00127333 | 8.41458 | 1 | [121, 616] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__InJulia__1SHOT__20231213_203315__765.json | 60.0 | missing | missing | missing | |
| 6154 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | InJulia | 1SHOT | false | false | 5 | 20231225_192056__233 | 0 | 0.000221847 | 1.15771 | 0 | [121, 74] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__InJulia__1SHOT__20231225_192056__233.json | 0.0 | missing | missing | missing | |
| 6155 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_192105__640 | 1 | 0.00139749 | 9.18409 | 1 | [121, 680] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__InJulia__1SHOT__20231225_192105__640.json | 60.0 | missing | missing | missing | |
| 6156 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_195602__194 | 3 | 0.00151583 | 9.9888 | 5 | [121, 741] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__InJulia__1SHOT__20231227_195602__194.json | 90.0 | missing | missing | missing | |
| 6157 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_195611__239 | 1 | 0.00124811 | 8.2163 | 1 | [121, 603] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__InJulia__1SHOT__20231227_195611__239.json | 60.0 | missing | missing | missing | |
| 6158 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small--optim | InJulia | 1SHOT | false | false | 5 | 20231215_193942__424 | 0 | 0.0 | 1.13892 | 0 | [121, 74] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__InJulia__1SHOT__20231215_193942__424.json | 0.0 | 0.9 | missing | 0.3 | |
| 6159 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203305__419 | 3 | 0.00103019 | 9.34053 | 5 | [162, 477] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_203305__419.json | 90.0 | missing | missing | missing | |
| 6160 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192050__623 | 5 | 0.000638314 | 3.80959 | 5 | [162, 275] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_192050__623.json | 100.0 | missing | missing | missing | |
| 6161 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192054__685 | 1 | 0.000572354 | 3.3517 | 1 | [162, 241] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_192054__685.json | 60.0 | missing | missing | missing | |
| 6162 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_195545__410 | 1 | 0.000582054 | 3.40662 | 1 | [162, 246] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_195545__410.json | 60.0 | missing | missing | missing | |
| 6163 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_195551__448 | 1 | 0.000842014 | 5.11582 | 1 | [162, 380] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_195551__448.json | 60.0 | missing | missing | missing | |
| 6164 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_193940__528 | 1 | 0.0 | 4.47399 | 1 | [162, 331] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_193940__528.json | 60.0 | 0.9 | missing | 0.3 | |
| 6165 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_203255__952 | 5 | 0.00118544 | 16.018 | 5 | [309, 508] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203255__952.json | 100.0 | missing | missing | missing | |
| 6166 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_192039__563 | 1 | 0.00108456 | 6.22248 | 5 | [309, 456] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192039__563.json | 80.0 | missing | missing | missing | |
| 6167 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_192045__524 | 0 | 0.000888623 | 4.93702 | 1 | [309, 355] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192045__524.json | 55.0 | missing | missing | missing | |
| 6168 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_195534__644 | 1 | 0.00108844 | 6.18568 | 1 | [309, 458] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_195534__644.json | 60.0 | missing | missing | missing | |
| 6169 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_195541__899 | 4 | 0.0010768 | 6.20711 | 4 | [309, 452] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_195541__899.json | 90.0 | missing | missing | missing | |
| 6170 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_193935__337 | 0 | 0.0 | 6.9171 | 0 | [309, 479] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193935__337.json | 25.0 | 0.9 | missing | 0.3 | |
| 6171 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_203344__171 | 0 | 0.00150688 | 8.64488 | 0 | [428, 634] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203344__171.json | 25.0 | missing | missing | missing | |
| 6172 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_192206__831 | 0 | 0.00198024 | 12.0558 | 0 | [428, 878] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192206__831.json | 25.0 | missing | missing | missing | |
| 6173 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_192214__615 | 0 | 0.00146614 | 8.44815 | 0 | [428, 613] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192214__615.json | 25.0 | missing | missing | missing | |
| 6174 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_195643__197 | 0 | 0.0014545 | 8.25583 | 0 | [428, 607] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_195643__197.json | 50.0 | missing | missing | missing | |
| 6175 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_195652__583 | 0 | 0.00150106 | 8.61653 | 0 | [428, 631] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_195652__583.json | 25.0 | missing | missing | missing | |
| 6176 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_194010__873 | 0 | 0.0 | 8.02773 | 0 | [428, 595] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_194010__873.json | 0.0 | 0.9 | missing | 0.3 | |
| 6177 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_203336__943 | 0 | 0.0017151 | 10.103 | 0 | [426, 742] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_203336__943.json | 25.0 | missing | missing | missing | |
| 6178 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192138__665 | 1 | 0.00177912 | 12.5568 | 1 | [426, 775] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_192138__665.json | 60.0 | missing | missing | missing | |
| 6179 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192153__146 | 1 | 0.00221756 | 13.7601 | 1 | [426, 1001] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_192153__146.json | 60.0 | missing | missing | missing | |
| 6180 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_195622__677 | 1 | 0.00165302 | 9.78872 | 1 | [426, 710] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_195622__677.json | 60.0 | missing | missing | missing | |
| 6181 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_195633__447 | 4 | 0.00187806 | 11.2791 | 5 | [426, 826] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_195633__447.json | 95.0 | missing | missing | missing | |
| 6182 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-small--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_194001__227 | 5 | 0.0 | 10.3335 | 5 | [426, 742] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_194001__227.json | 100.0 | 0.9 | missing | 0.3 | |
| 6183 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_203221__410 | 0 | 0.000209045 | 6.73764 | 0 | [118, 425] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__AsIs__1SHOT__20231213_203221__410.json | 0.0 | missing | missing | missing | |
| 6184 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | AsIs | 1SHOT | true | false | 5 | 20231225_191953__328 | 0 | 0.000189113 | 4.24167 | 0 | [118, 381] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__AsIs__1SHOT__20231225_191953__328.json | 25.0 | missing | missing | missing | |
| 6185 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_192000__720 | 0 | 0.000351287 | 6.30819 | 0 | [118, 739] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__AsIs__1SHOT__20231225_192000__720.json | 0.0 | missing | missing | missing | |
| 6186 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny--optim | AsIs | 1SHOT | true | true | 5 | 20231215_193916__391 | 0 | 0.0 | 3.90948 | 0 | [118, 444] | 0.10.0-DEV | 5 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__AsIs__1SHOT__20231215_193916__391.json | 50.0 | 0.9 | missing | 0.3 | |
| 6187 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231213_203214__561 | 0 | 0.000146498 | 4.54067 | 0 | [121, 286] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__InJulia__1SHOT__20231213_203214__561.json | 50.0 | missing | missing | missing | |
| 6188 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_191945__868 | 0 | 0.000173225 | 3.06425 | 0 | [121, 345] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__InJulia__1SHOT__20231225_191945__868.json | 50.0 | missing | missing | missing | |
| 6189 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_191949__950 | 0 | 0.000226679 | 4.10358 | 0 | [121, 463] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__InJulia__1SHOT__20231225_191949__950.json | 50.0 | missing | missing | missing | |
| 6190 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | InJulia | 1SHOT | false | false | 5 | 20231227_195503__838 | 0 | 0.000266543 | 4.92137 | 0 | [121, 551] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__InJulia__1SHOT__20231227_195503__838.json | 0.0 | missing | missing | missing | |
| 6191 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_195507__915 | 0 | 0.000185909 | 3.40176 | 0 | [121, 373] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__InJulia__1SHOT__20231227_195507__915.json | 50.0 | missing | missing | missing | |
| 6192 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny--optim | InJulia | 1SHOT | false | false | 5 | 20231215_193912__278 | 0 | 0.0 | 3.38017 | 0 | [121, 388] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__InJulia__1SHOT__20231215_193912__278.json | 0.0 | 0.9 | missing | 0.3 | |
| 6193 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203209__363 | 0 | 0.000204333 | 5.82275 | 0 | [162, 401] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_203209__363.json | 50.0 | missing | missing | missing | |
| 6194 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_191939__565 | 0 | 0.000100596 | 2.25635 | 0 | [162, 172] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_191939__565.json | 50.0 | missing | missing | missing | |
| 6195 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_191942__189 | 0 | 0.000162204 | 2.79058 | 0 | [162, 308] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_191942__189.json | 0.0 | missing | missing | missing | |
| 6196 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_195455__358 | 0 | 0.000179418 | 3.19883 | 0 | [162, 346] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_195455__358.json | 0.0 | missing | missing | missing | |
| 6197 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_195458__556 | 0 | 0.000181683 | 3.13071 | 0 | [162, 351] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_195458__556.json | 50.0 | missing | missing | missing | |
| 6198 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231215_193908__514 | 0 | 0.0 | 2.29118 | 0 | [162, 263] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_193908__514.json | 25.0 | 0.9 | missing | 0.3 | |
| 6199 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_203203__297 | 0 | 0.000257529 | 9.96646 | 0 | [309, 473] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203203__297.json | 25.0 | missing | missing | missing | |
| 6200 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_191932__568 | 1 | 0.000356283 | 11.3704 | 1 | [309, 691] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191932__568.json | 60.0 | missing | missing | missing | |
| 6201 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_191936__548 | 0 | 0.000240315 | 4.08988 | 0 | [309, 435] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_191936__548.json | 25.0 | missing | missing | missing | |
| 6202 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_195446__553 | 0 | 0.000209964 | 9.24456 | 0 | [309, 368] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_195446__553.json | 50.0 | missing | missing | missing | |
| 6203 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_195451__171 | 0 | 0.000283803 | 4.90761 | 0 | [309, 531] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_195451__171.json | 50.0 | missing | missing | missing | |
| 6204 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_193905__386 | 0 | 0.0 | 5.26826 | 0 | [309, 382] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_193905__386.json | 50.0 | 0.9 | missing | 0.3 | |
| 6205 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_203239__799 | 0 | 0.000519715 | 13.2908 | 0 | [428, 1015] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203239__799.json | 50.0 | missing | missing | missing | |
| 6206 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_192028__923 | 0 | 0.000429115 | 12.8431 | 0 | [428, 815] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192028__923.json | 50.0 | missing | missing | missing | |
| 6207 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_192033__141 | 0 | 0.000247009 | 4.96681 | 0 | [428, 413] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192033__141.json | 50.0 | missing | missing | missing | |
| 6208 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_195524__208 | 1 | 0.000262864 | 4.18211 | 1 | [428, 448] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_195524__208.json | 60.0 | missing | missing | missing | |
| 6209 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_195528__302 | 0 | 0.000225718 | 3.401 | 0 | [428, 366] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_195528__302.json | 50.0 | missing | missing | missing | |
| 6210 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_193928__730 | 0 | 0.0 | 6.41131 | 0 | [428, 703] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_193928__730.json | 50.0 | 0.9 | missing | 0.3 | |
| 6211 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_203226__653 | 0 | 0.000229515 | 4.87198 | 0 | [426, 375] | 0.10.0-DEV | 5 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_203226__653.json | 0.0 | missing | missing | missing | |
| 6212 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192011__678 | 0 | 0.000388518 | 11.227 | 0 | [426, 726] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_192011__678.json | 50.0 | missing | missing | missing | |
| 6213 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_192015__824 | 0 | 0.000275268 | 4.30126 | 0 | [426, 476] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_192015__824.json | 0.0 | missing | missing | missing | |
| 6214 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_195513__750 | 1 | 0.00031332 | 5.25005 | 1 | [426, 560] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_195513__750.json | 60.0 | missing | missing | missing | |
| 6215 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_195520__289 | 0 | 0.000371757 | 6.51567 | 0 | [426, 689] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_195520__289.json | 0.0 | missing | missing | missing | |
| 6216 | Apple-MacBook-Pro-M1 | event_scheduler | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | false | 5 | 20231215_193921__455 | 0 | 0.0 | 5.32965 | 0 | [426, 581] | 0.10.0-DEV | 5 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_193921__455.json | 25.0 | 0.9 | missing | 0.3 | |
| 6217 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_233018__689 | 0 | 0.0 | 20.0435 | 0 | [107, 571] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_233018__689.json | 0.0 | missing | missing | missing | |
| 6218 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_233032__231 | 0 | 0.0 | 14.7181 | 0 | [1, 446] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_233032__231.json | 0.0 | missing | missing | missing | |
| 6219 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_233047__718 | 0 | 0.0 | 14.616 | 0 | [1, 443] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_233047__718.json | 0.0 | missing | missing | missing | |
| 6220 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_180529__456 | 0 | 0.0 | 16.4808 | 0 | [117, 409] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_180529__456.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6221 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_180540__568 | 0 | 0.0 | 11.6554 | 0 | [117, 286] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_180540__568.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6222 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_232935__450 | 0 | 0.0 | 10.0142 | 0 | [1, 310] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_232935__450.json | 0.0 | missing | missing | missing | |
| 6223 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_232958__191 | 0 | 0.0 | 22.2451 | 0 | [1, 651] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_232958__191.json | 0.0 | missing | missing | missing | |
| 6224 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180456__719 | 1 | 0.0 | 15.0538 | 1 | [120, 373] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_180456__719.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6225 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_180512__275 | 0 | 0.0 | 16.2533 | 0 | [120, 403] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_180512__275.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6226 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_000929__462 | 0 | 0.0 | 13.4852 | 0 | [120, 332] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_000929__462.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6227 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_232850__215 | 1 | 0.0 | 12.2316 | 1 | [1, 370] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_232850__215.json | 60.0 | missing | missing | missing | |
| 6228 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_232902__585 | 1 | 0.0 | 12.0173 | 1 | [1, 364] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_232902__585.json | 60.0 | missing | missing | missing | |
| 6229 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_180430__857 | 0 | 0.0 | 10.8148 | 0 | [161, 255] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_180430__857.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6230 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_180441__972 | 0 | 0.0 | 10.32 | 0 | [161, 243] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_180441__972.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6231 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_000916__411 | 0 | 0.0 | 12.8679 | 0 | [161, 307] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_000916__411.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6232 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_232806__214 | 1 | 0.0 | 11.2386 | 1 | [1, 326] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_232806__214.json | 60.0 | missing | missing | missing | |
| 6233 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_232818__122 | 0 | 0.0 | 11.4216 | 0 | [1, 331] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_232818__122.json | 0.0 | missing | missing | missing | |
| 6234 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_180401__953 | 0 | 0.0 | 21.9271 | 0 | [308, 373] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_180401__953.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6235 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_180419__841 | 1 | 0.0 | 18.1683 | 1 | [308, 418] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_180419__841.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6236 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_000903__300 | 0 | 0.0 | 38.8534 | 0 | [308, 785] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_000903__300.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6237 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_233332__996 | 0 | 0.0 | 28.7447 | 0 | [1, 752] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_233332__996.json | 0.0 | missing | missing | missing | |
| 6238 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_233356__334 | 0 | 0.0 | 23.5894 | 0 | [1, 629] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_233356__334.json | 0.0 | missing | missing | missing | |
| 6239 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_180641__480 | 1 | 0.0 | 17.3299 | 1 | [428, 374] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_180641__480.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6240 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_180704__912 | 0 | 0.0 | 23.2448 | 0 | [428, 519] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_180704__912.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6241 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_001032__574 | 0 | 0.0 | 46.1297 | 0 | [428, 1047] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001032__574.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6242 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_233203__922 | 0 | 0.0 | 49.1653 | 0 | [1, 1206] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_233203__922.json | 0.0 | missing | missing | missing | |
| 6243 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_233237__756 | 0 | 0.0 | 33.2246 | 0 | [1, 857] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_233237__756.json | 0.0 | missing | missing | missing | |
| 6244 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_180556__330 | 0 | 0.0 | 15.8858 | 0 | [426, 339] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180556__330.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6245 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180623__499 | 0 | 0.0 | 26.9476 | 0 | [426, 608] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180623__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6246 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_000945__784 | 0 | 0.0 | 15.8975 | 0 | [426, 340] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_000945__784.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6247 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_232730__414 | 0 | 0.0 | 11.5068 | 0 | [119, 357] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232730__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6248 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_232749__256 | 0 | 0.0 | 19.5137 | 0 | [119, 608] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232749__256.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6249 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_232804__610 | 0 | 0.0 | 14.844 | 0 | [119, 463] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232804__610.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6250 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_232817__598 | 4 | 0.0 | 12.4317 | 5 | [119, 387] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232817__598.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6251 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_232836__531 | 0 | 0.0 | 18.1831 | 0 | [119, 567] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232836__531.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6252 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232645__534 | 0 | 0.0 | 7.35757 | 0 | [160, 218] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232645__534.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6253 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232654__558 | 0 | 0.0 | 8.37693 | 0 | [160, 251] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232654__558.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6254 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232702__196 | 0 | 0.0 | 8.71117 | 0 | [160, 262] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232702__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6255 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232708__339 | 0 | 0.0 | 6.12162 | 0 | [160, 178] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232708__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6256 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232718__729 | 0 | 0.0 | 9.59953 | 0 | [160, 290] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232718__729.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6257 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232525__878 | 1 | 0.0 | 13.7538 | 1 | [307, 373] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232525__878.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6258 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_232546__364 | 0 | 0.0 | 20.0305 | 0 | [307, 582] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232546__364.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6259 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_232602__748 | 0 | 0.0 | 16.1634 | 0 | [307, 462] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232602__748.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6260 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_232619__458 | 0 | 0.0 | 17.3022 | 0 | [307, 501] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232619__458.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6261 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_232638__504 | 0 | 0.0 | 18.4443 | 0 | [307, 534] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232638__504.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6262 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233035__412 | 1 | 0.0 | 20.6368 | 1 | [427, 573] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233035__412.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6263 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233053__307 | 0 | 0.0 | 17.0421 | 0 | [427, 466] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233053__307.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6264 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_233104__897 | 0 | 0.0 | 10.7946 | 0 | [427, 277] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233104__897.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6265 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233118__722 | 0 | 0.0 | 14.6465 | 0 | [427, 395] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233118__722.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6266 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233141__665 | 0 | 0.0 | 23.0017 | 0 | [427, 643] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233141__665.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6267 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_232851__923 | 0 | 0.0 | 15.3954 | 0 | [425, 418] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232851__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6268 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_232905__956 | 0 | 0.0 | 14.021 | 0 | [425, 376] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232905__956.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6269 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232931__453 | 0 | 0.0 | 25.6321 | 0 | [425, 719] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232931__453.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6270 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232952__493 | 0 | 0.0 | 20.7642 | 0 | [425, 578] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232952__493.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6271 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_233015__144 | 0 | 0.0 | 23.1223 | 0 | [425, 646] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_233015__144.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6272 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_233434__325 | 0 | 0.0 | 12.4716 | 0 | [119, 305] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233434__325.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6273 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_233449__351 | 0 | 0.0 | 15.5926 | 0 | [119, 384] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233449__351.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6274 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_233505__619 | 0 | 0.0 | 16.1424 | 0 | [119, 398] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233505__619.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6275 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_233519__748 | 0 | 0.0 | 14.0446 | 0 | [119, 345] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233519__748.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6276 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_233539__787 | 0 | 0.0 | 19.6661 | 0 | [119, 486] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233539__787.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6277 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_233346__561 | 0 | 0.0 | 5.45156 | 0 | [160, 120] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233346__561.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6278 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_233357__795 | 0 | 0.0 | 11.4513 | 0 | [160, 274] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233357__795.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6279 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_233403__638 | 0 | 0.0 | 5.72493 | 0 | [160, 127] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233403__638.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6280 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_233409__849 | 0 | 0.0 | 6.29588 | 0 | [160, 142] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233409__849.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6281 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_233421__315 | 0 | 0.0 | 11.5734 | 0 | [160, 277] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233421__315.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6282 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_233204__619 | 0 | 0.0 | 23.0643 | 0 | [307, 515] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233204__619.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6283 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_233220__885 | 0 | 0.0 | 15.6309 | 0 | [307, 353] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233220__885.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6284 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_233247__469 | 0 | 0.0 | 26.6227 | 0 | [307, 620] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233247__469.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6285 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_233308__549 | 1 | 0.0 | 21.672 | 1 | [307, 501] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233308__549.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6286 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_233340__444 | 0 | 0.0 | 31.8406 | 0 | [307, 744] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233340__444.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6287 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233807__520 | 0 | 0.0 | 17.1494 | 0 | [427, 369] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233807__520.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6288 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233823__705 | 0 | 0.0 | 16.1236 | 0 | [427, 344] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233823__705.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6289 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_233851__681 | 0 | 0.0 | 27.801 | 0 | [427, 624] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233851__681.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6290 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233908__998 | 1 | 0.0 | 17.6026 | 1 | [427, 380] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233908__998.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6291 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233934__350 | 0 | 0.0 | 26.0185 | 0 | [427, 582] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233934__350.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6292 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_233607__168 | 1 | 0.0 | 27.6262 | 1 | [425, 620] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233607__168.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6293 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_233624__487 | 0 | 0.0 | 16.3862 | 0 | [425, 350] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233624__487.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6294 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_233642__305 | 0 | 0.0 | 18.0215 | 0 | [425, 390] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233642__305.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6295 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_233718__288 | 0 | 0.0 | 36.3549 | 0 | [425, 823] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233718__288.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6296 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_233749__660 | 0 | 0.0 | 31.3384 | 0 | [425, 707] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233749__660.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6297 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | true | false | 5 | 20231226_121656__487 | 0 | 0.0 | 24.3026 | 0 | [116, 444] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_121656__487.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6298 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | true | true | 5 | 20231226_121714__437 | 0 | 0.0 | 18.0611 | 0 | [116, 328] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_121714__437.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6299 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_121612__690 | 0 | 0.0 | 24.2121 | 0 | [119, 442] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121612__690.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6300 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_121632__295 | 0 | 0.0 | 19.797 | 0 | [119, 357] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121632__295.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6301 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_001435__149 | 0 | 0.0 | 16.1957 | 0 | [119, 292] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_001435__149.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6302 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_121529__657 | 0 | 0.0 | 7.85832 | 0 | [160, 132] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_121529__657.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6303 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_121548__209 | 0 | 0.0 | 17.874 | 0 | [160, 320] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_121548__209.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6304 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_001419__637 | 0 | 0.0 | 8.13928 | 0 | [160, 137] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_001419__637.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6305 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_121457__546 | 0 | 0.0 | 19.1497 | 0 | [307, 324] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121457__546.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6306 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_121521__772 | 0 | 0.0 | 23.6948 | 0 | [307, 407] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121521__772.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6307 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_001411__924 | 0 | 0.0 | 41.7223 | 1 | [307, 573] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001411__924.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6308 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_121906__891 | 0 | 0.0 | 38.7066 | 0 | [427, 658] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121906__891.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6309 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_121939__595 | 0 | 0.0 | 33.1704 | 0 | [427, 560] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121939__595.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6310 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_001537__404 | 0 | 0.0 | 24.0853 | 0 | [427, 396] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001537__404.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6311 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_121747__678 | 0 | 0.0 | 32.5271 | 0 | [425, 549] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121747__678.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6312 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_121827__743 | 0 | 0.0 | 40.0955 | 0 | [425, 683] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121827__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6313 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001513__901 | 0 | 0.0 | 37.4031 | 0 | [425, 633] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_001513__901.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6314 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_113356__534 | 1 | 0.0 | 75.0369 | 1 | [123, 438] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_113356__534.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6315 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_113523__575 | 0 | 0.0 | 86.4353 | 0 | [123, 506] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_113523__575.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6316 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_113623__730 | 1 | 0.0 | 59.4673 | 1 | [123, 345] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_113623__730.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6317 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_150553__665 | 1 | 0.0 | 99.5454 | 1 | [123, 581] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_150553__665.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6318 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_150710__959 | 1 | 0.0 | 76.665 | 1 | [123, 446] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_150710__959.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6319 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_113139__445 | 1 | 0.0 | 63.7852 | 1 | [162, 361] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_113139__445.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6320 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_113204__941 | 0 | 0.0 | 24.1518 | 0 | [162, 121] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_113204__941.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6321 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_113241__663 | 0 | 0.0 | 37.4115 | 0 | [162, 202] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_113241__663.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6322 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_150321__206 | 1 | 0.0 | 27.0897 | 1 | [162, 138] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_150321__206.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6323 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_150413__935 | 0 | 0.0 | 52.1942 | 0 | [162, 290] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_150413__935.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6324 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_112734__236 | 1 | 0.0 | 110.792 | 1 | [313, 586] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112734__236.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6325 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_112854__178 | 1 | 0.0 | 79.0073 | 1 | [313, 426] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112854__178.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6326 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_113035__656 | 0 | 0.0 | 101.556 | 0 | [313, 557] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_113035__656.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6327 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_150034__588 | 0 | 0.0 | 123.306 | 1 | [313, 679] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_150034__588.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6328 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_150253__142 | 1 | 0.0 | 138.956 | 1 | [313, 767] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_150253__142.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6329 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_114041__402 | 0 | 0.0 | 33.249 | 0 | [436, 132] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114041__402.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6330 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_114239__445 | 1 | 0.0 | 118.598 | 1 | [436, 627] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114239__445.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6331 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_114428__896 | 1 | 0.0 | 108.713 | 1 | [436, 571] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114428__896.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6332 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_151151__131 | 1 | 0.0 | 86.9161 | 1 | [436, 444] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_151151__131.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6333 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_151312__192 | 4 | 0.0 | 80.0327 | 5 | [436, 404] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_151312__192.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6334 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_113821__130 | 0 | 0.0 | 118.749 | 0 | [434, 628] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_113821__130.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6335 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_113923__606 | 4 | 0.0 | 61.8981 | 5 | [434, 301] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_113923__606.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6336 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_114008__725 | 1 | 0.0 | 43.8481 | 1 | [434, 195] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_114008__725.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6337 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_151024__562 | 0 | 0.0 | 44.2058 | 0 | [434, 195] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_151024__562.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6338 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_233641__163 | 0 | 0.0 | 17.4232 | 0 | [107, 500] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_233641__163.json | 0.0 | missing | missing | missing | |
| 6339 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_233657__766 | 0 | 0.0 | 15.5421 | 0 | [1, 469] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_233657__766.json | 0.0 | missing | missing | missing | |
| 6340 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_233714__668 | 0 | 0.0 | 17.4128 | 0 | [1, 521] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231219_233714__668.json | 0.0 | missing | missing | missing | |
| 6341 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_180854__506 | 0 | 0.0 | 11.7604 | 0 | [125, 287] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_180854__506.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6342 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_180910__799 | 0 | 0.0 | 16.1725 | 0 | [125, 400] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_180910__799.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6343 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_233603__628 | 0 | 0.0 | 15.8316 | 0 | [1, 477] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_233603__628.json | 0.0 | missing | missing | missing | |
| 6344 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_233624__727 | 0 | 0.0 | 21.2782 | 0 | [1, 625] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_233624__727.json | 0.0 | missing | missing | missing | |
| 6345 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180826__120 | 1 | 0.0 | 18.1424 | 1 | [128, 450] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_180826__120.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6346 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180842__179 | 1 | 0.0 | 15.5197 | 1 | [128, 383] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_180842__179.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6347 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_001122__716 | 0 | 0.0 | 12.1005 | 0 | [128, 296] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_001122__716.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6348 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_233519__803 | 0 | 0.0 | 10.7559 | 0 | [1, 328] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_233519__803.json | 0.0 | missing | missing | missing | |
| 6349 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_233529__356 | 0 | 0.0 | 10.0393 | 0 | [1, 307] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_233529__356.json | 0.0 | missing | missing | missing | |
| 6350 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_180757__848 | 0 | 0.0 | 9.85069 | 0 | [169, 230] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_180757__848.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6351 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_180808__640 | 0 | 0.0 | 10.9163 | 0 | [169, 257] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_180808__640.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6352 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_001110__571 | 0 | 0.0 | 8.85653 | 0 | [169, 204] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_001110__571.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6353 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_233441__733 | 0 | 0.0 | 18.8836 | 0 | [1, 530] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_233441__733.json | 25.0 | missing | missing | missing | |
| 6354 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_233456__286 | 0 | 0.0 | 14.8199 | 0 | [1, 423] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_233456__286.json | 0.0 | missing | missing | missing | |
| 6355 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_180729__193 | 0 | 0.0 | 24.8618 | 0 | [316, 425] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_180729__193.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6356 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_180747__261 | 0 | 0.0 | 17.0523 | 0 | [316, 389] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_180747__261.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6357 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_001101__240 | 0 | 0.0 | 29.219 | 0 | [316, 540] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001101__240.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6358 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_233905__396 | 0 | 0.0 | 23.5968 | 0 | [1, 629] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_233905__396.json | 0.0 | missing | missing | missing | |
| 6359 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_233937__742 | 1 | 0.0 | 31.9501 | 1 | [1, 827] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_233937__742.json | 60.0 | missing | missing | missing | |
| 6360 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_181009__873 | 4 | 0.0 | 24.1206 | 5 | [436, 539] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181009__873.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6361 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_181031__157 | 0 | 0.0 | 20.8991 | 0 | [436, 459] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181031__157.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6362 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_001233__127 | 0 | 0.0 | 32.4517 | 0 | [436, 733] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001233__127.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6363 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_233807__919 | 0 | 0.0 | 27.6954 | 0 | [1, 728] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_233807__919.json | 0.0 | missing | missing | missing | |
| 6364 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_233825__788 | 1 | 0.0 | 18.2046 | 1 | [1, 496] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_233825__788.json | 60.0 | missing | missing | missing | |
| 6365 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180933__953 | 1 | 0.0 | 22.7427 | 1 | [434, 506] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180933__953.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6366 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180945__169 | 4 | 0.0 | 11.2036 | 5 | [434, 224] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180945__169.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6367 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001200__504 | 0 | 0.0 | 38.1619 | 0 | [434, 865] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_001200__504.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6368 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231214_004107__437 | 0 | 0.0 | 17.5334 | 0 | [107, 502] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_004107__437.json | 0.0 | missing | missing | missing | |
| 6369 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_165532__379 | 0 | 0.0 | 12.6429 | 0 | [123, 402] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_165532__379.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6370 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_165542__107 | 0 | 0.0 | 9.90655 | 0 | [123, 313] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_165542__107.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6371 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231214_004050__464 | 1 | 0.0 | 18.208 | 1 | [124, 518] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_004050__464.json | 60.0 | missing | missing | missing | |
| 6372 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_165506__865 | 1 | 0.0 | 9.55682 | 1 | [126, 301] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_165506__865.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6373 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_165519__279 | 0 | 0.0 | 13.2302 | 0 | [126, 420] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_165519__279.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 6374 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_234315__817 | 1 | 0.0 | 9.52248 | 1 | [126, 298] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_234315__817.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6375 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_004031__803 | 1 | 0.0 | 13.9204 | 1 | [153, 386] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_004031__803.json | 60.0 | missing | missing | missing | |
| 6376 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_165447__515 | 1 | 0.0 | 13.2425 | 1 | [167, 409] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_165447__515.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6377 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_165456__355 | 1 | 0.0 | 7.83331 | 1 | [167, 234] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_165456__355.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6378 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234306__420 | 1 | 0.0 | 11.4801 | 1 | [167, 351] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_234306__420.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6379 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_004017__321 | 0 | 0.0 | 19.1933 | 0 | [300, 470] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004017__321.json | 25.0 | missing | missing | missing | |
| 6380 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_165418__321 | 0 | 0.0 | 20.5997 | 1 | [314, 454] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165418__321.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 6381 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_165434__925 | 1 | 0.0 | 15.9658 | 1 | [314, 470] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165434__925.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6382 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_234254__531 | 4 | 0.0 | 17.7756 | 5 | [314, 369] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234254__531.json | 95.0 | missing | {"num_gpu": 99} | missing | |
| 6383 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_004152__451 | 1 | 0.0 | 23.106 | 1 | [11, 611] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004152__451.json | 60.0 | missing | missing | missing | |
| 6384 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_165633__753 | 1 | 0.0 | 18.1781 | 1 | [434, 512] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165633__753.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6385 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_165645__978 | 0 | 0.0 | 11.6872 | 0 | [434, 312] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165645__978.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 6386 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_234352__341 | 1 | 0.0 | 23.8949 | 1 | [434, 679] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234352__341.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6387 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_004129__509 | 0 | 0.0 | 21.8459 | 0 | [424, 483] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_004129__509.json | 0.0 | missing | missing | missing | |
| 6388 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_165557__815 | 1 | 0.0 | 15.1881 | 1 | [432, 419] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_165557__815.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6389 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_165615__757 | 0 | 0.0 | 17.4134 | 0 | [432, 489] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_165615__757.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 6390 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_234328__125 | 1 | 0.0 | 12.4309 | 1 | [432, 333] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_234328__125.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6391 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231214_005431__986 | 0 | 0.0 | 16.2656 | 0 | [107, 466] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__AsIs__1SHOT__20231214_005431__986.json | 0.0 | missing | missing | missing | |
| 6392 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_172851__931 | 0 | 0.0 | 5.24191 | 0 | [124, 80] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__AsIs__1SHOT__20231225_172851__931.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6393 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_172854__223 | 0 | 0.0 | 3.18049 | 0 | [124, 41] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__AsIs__1SHOT__20231225_172854__223.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6394 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231214_005415__854 | 1 | 0.0 | 16.6729 | 1 | [124, 476] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__InJulia__1SHOT__20231214_005415__854.json | 60.0 | missing | missing | missing | |
| 6395 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_172843__310 | 0 | 0.0 | 2.66047 | 0 | [127, 31] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__InJulia__1SHOT__20231225_172843__310.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6396 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_172846__823 | 0 | 0.0 | 2.98491 | 0 | [127, 37] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__InJulia__1SHOT__20231225_172846__823.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6397 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231226_235520__196 | 0 | 0.0 | 2.86239 | 0 | [127, 35] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__InJulia__1SHOT__20231226_235520__196.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6398 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005358__999 | 0 | 0.0 | 12.1132 | 1 | [153, 335] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_005358__999.json | 55.0 | missing | missing | missing | |
| 6399 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_172825__259 | 0 | 0.0 | 6.55695 | 0 | [166, 95] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_172825__259.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6400 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_172840__145 | 0 | 0.0 | 14.9424 | 0 | [166, 251] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_172840__145.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 6401 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_235517__689 | 0 | 0.0 | 21.1746 | 0 | [166, 367] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_235517__689.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6402 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_005346__576 | 0 | 0.0 | 17.173 | 0 | [300, 417] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005346__576.json | 0.0 | missing | missing | missing | |
| 6403 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_172744__444 | 0 | 0.0 | 66.0561 | 0 | [313, 899] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172744__444.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 6404 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_172819__250 | 0 | 0.0 | 34.2486 | 0 | [313, 559] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172819__250.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6405 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_235456__415 | 0 | 0.0 | 34.075 | 0 | [313, 410] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235456__415.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6406 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_005520__633 | 0 | 0.0 | 18.6467 | 0 | [11, 500] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005520__633.json | 0.0 | missing | missing | missing | |
| 6407 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_173147__898 | 0 | 0.0 | 58.5885 | 0 | [430, 927] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_173147__898.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6408 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_173304__854 | 0 | 0.0 | 76.9558 | 0 | [430, 1201] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_173304__854.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 6409 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_235535__376 | 0 | 0.0 | 7.85653 | 0 | [430, 79] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235535__376.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6410 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_005502__814 | 0 | 0.0 | 30.196 | 0 | [424, 689] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_005502__814.json | 0.0 | missing | missing | missing | |
| 6411 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_173005__830 | 0 | 0.0 | 70.1601 | 0 | [427, 1092] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_173005__830.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 6412 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_173048__551 | 0 | 0.0 | 43.6886 | 0 | [427, 691] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_173048__551.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 6413 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_235527__776 | 0 | 0.0 | 7.47177 | 0 | [427, 72] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_235527__776.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6414 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_234233__579 | 0 | 0.0 | 17.0071 | 0 | [107, 488] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_234233__579.json | 0.0 | missing | missing | missing | |
| 6415 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_234253__697 | 0 | 0.0 | 19.8752 | 0 | [1, 589] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_234253__697.json | 0.0 | missing | missing | missing | |
| 6416 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231219_234307__500 | 0 | 0.0 | 14.0445 | 0 | [1, 427] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231219_234307__500.json | 0.0 | missing | missing | missing | |
| 6417 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_181207__980 | 0 | 0.0 | 43.7893 | 0 | [112, 1512] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_181207__980.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6418 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_181210__137 | 0 | 0.0 | 3.54207 | 0 | [112, 129] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_181210__137.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6419 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231219_234156__950 | 0 | 0.0 | 18.9972 | 0 | [1, 564] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_234156__950.json | 0.0 | missing | missing | missing | |
| 6420 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231219_234216__698 | 0 | 0.0 | 19.4042 | 0 | [1, 575] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_234216__698.json | 0.0 | missing | missing | missing | |
| 6421 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_181104__343 | 0 | 0.0 | 5.25884 | 0 | [115, 196] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_181104__343.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6422 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_181123__434 | 0 | 0.0 | 19.2087 | 0 | [115, 713] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_181123__434.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6423 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_001306__886 | 0 | 0.0 | 22.3478 | 0 | [115, 821] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_001306__886.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6424 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_234106__563 | 0 | 0.0 | 11.5896 | 0 | [1, 352] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_234106__563.json | 0.0 | missing | missing | missing | |
| 6425 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_234116__953 | 1 | 0.0 | 10.1056 | 1 | [1, 309] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_234116__953.json | 60.0 | missing | missing | missing | |
| 6426 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_181047__942 | 0 | 0.0 | 4.61234 | 0 | [152, 166] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_181047__942.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6427 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_181058__711 | 1 | 0.0 | 10.9637 | 1 | [152, 409] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_181058__711.json | 60.0 | missing | {"num_gpu": 99} | missing | |
| 6428 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_001244__926 | 0 | 0.0 | 6.50601 | 0 | [152, 239] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_001244__926.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6429 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_234009__506 | 0 | 0.0 | 18.3373 | 0 | [1, 516] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234009__506.json | 25.0 | missing | missing | missing | |
| 6430 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_234036__119 | 0 | 0.0 | 27.4405 | 0 | [1, 745] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234036__119.json | 0.0 | missing | missing | missing | |
| 6431 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_181036__159 | 0 | 0.0 | 4.79623 | 0 | [270, 3] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181036__159.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6432 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_181043__798 | 0 | 0.0 | 6.72911 | 0 | [270, 227] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181043__798.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6433 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_001238__360 | 0 | 0.0 | 4.44613 | 0 | [270, 5] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001238__360.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6434 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_234513__364 | 0 | 0.0 | 25.9923 | 0 | [1, 687] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_234513__364.json | 0.0 | missing | missing | missing | |
| 6435 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_234539__401 | 0 | 0.0 | 26.4196 | 0 | [1, 697] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_234539__401.json | 25.0 | missing | missing | missing | |
| 6436 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_181256__982 | 0 | 0.0 | 17.8393 | 0 | [404, 594] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181256__982.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6437 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_181308__208 | 0 | 0.0 | 11.9465 | 0 | [404, 390] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181308__208.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6438 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_001329__780 | 0 | 0.0 | 9.53648 | 0 | [404, 304] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001329__780.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6439 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_234415__646 | 0 | 0.0 | 32.0753 | 0 | [1, 831] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_234415__646.json | 0.0 | missing | missing | missing | |
| 6440 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_234433__682 | 0 | 0.0 | 17.8564 | 0 | [1, 487] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_234433__682.json | 0.0 | missing | missing | missing | |
| 6441 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_181223__231 | 0 | 0.0 | 12.5273 | 0 | [401, 411] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_181223__231.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6442 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_181238__195 | 0 | 0.0 | 15.5757 | 0 | [401, 517] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_181238__195.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6443 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_001320__314 | 0 | 0.0 | 13.165 | 0 | [401, 432] | 0.10.0-DEV | 5 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_001320__314.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 6444 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231214_005632__230 | 0 | 0.0 | 22.077 | 0 | [107, 624] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_005632__230.json | 0.0 | missing | missing | missing | |
| 6445 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | AsIs | 1SHOT | true | true | 5 | 20231225_173903__929 | 4 | 0.0 | 45.5538 | 5 | [132, 344] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_173903__929.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6446 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_174003__759 | 0 | 0.0 | 60.1909 | 0 | [132, 460] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_174003__759.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6447 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231214_005610__117 | 1 | 0.0 | 18.8931 | 1 | [124, 537] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_005610__117.json | 60.0 | missing | missing | missing | |
| 6448 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_173720__374 | 0 | 0.0 | 39.386 | 0 | [135, 294] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_173720__374.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6449 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_173817__972 | 5 | 0.0 | 57.1932 | 5 | [135, 436] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_173817__972.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6450 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231226_235827__869 | 5 | 0.0 | 31.0459 | 5 | [135, 225] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_235827__869.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6451 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005550__659 | 1 | 0.0 | 13.4789 | 1 | [153, 374] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_005550__659.json | 60.0 | missing | missing | missing | |
| 6452 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_173547__954 | 1 | 0.0 | 56.8508 | 1 | [174, 412] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_173547__954.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6453 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_173641__971 | 5 | 0.0 | 52.5665 | 5 | [174, 392] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_173641__971.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6454 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_235755__181 | 2 | 0.0 | 51.5663 | 5 | [174, 381] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_235755__181.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6455 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_005537__435 | 1 | 0.0 | 16.3369 | 1 | [300, 394] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005537__435.json | 60.0 | missing | missing | missing | |
| 6456 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_173420__725 | 3 | 0.0 | 76.0186 | 5 | [321, 357] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_173420__725.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6457 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_173451__127 | 0 | 0.0 | 29.969 | 0 | [321, 178] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_173451__127.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6458 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_235704__831 | 4 | 0.0 | 88.3907 | 5 | [321, 476] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235704__831.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6459 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_005717__118 | 0 | 0.0 | 22.4306 | 0 | [11, 593] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005717__118.json | 0.0 | missing | missing | missing | |
| 6460 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_174240__600 | 5 | 0.0 | 56.5199 | 5 | [438, 372] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_174240__600.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6461 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_174328__870 | 1 | 0.0 | 48.7537 | 1 | [438, 312] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_174328__870.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6462 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_235906__726 | 0 | 0.0 | 9.69317 | 0 | [438, 5] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235906__726.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6463 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_005655__751 | 0 | 0.0 | 22.542 | 0 | [424, 499] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_005655__751.json | 0.0 | missing | missing | missing | |
| 6464 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_174046__719 | 1 | 0.0 | 42.6473 | 5 | [435, 265] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_174046__719.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6465 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_174143__819 | 4 | 0.0 | 56.3974 | 4 | [435, 371] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_174143__819.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6466 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_235856__790 | 4 | 0.0 | 29.1137 | 5 | [435, 159] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_235856__790.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6467 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_232413__131 | 0 | 0.0 | 21.8505 | 0 | [107, 620] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_232413__131.json | 0.0 | missing | missing | missing | |
| 6468 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_232431__905 | 0 | 0.0 | 18.3168 | 0 | [1, 546] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_232431__905.json | 0.0 | missing | missing | missing | |
| 6469 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_232456__624 | 0 | 0.0 | 24.065 | 0 | [1, 699] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_232456__624.json | 0.0 | missing | missing | missing | |
| 6470 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_180135__640 | 0 | 0.0 | 25.6228 | 0 | [125, 428] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_180135__640.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6471 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_180156__791 | 0 | 0.0 | 20.8663 | 0 | [125, 347] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_180156__791.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6472 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_232336__240 | 1 | 0.0 | 15.4543 | 1 | [1, 466] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_232336__240.json | 60.0 | missing | missing | missing | |
| 6473 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_232351__643 | 0 | 0.0 | 15.1844 | 0 | [1, 459] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_232351__643.json | 0.0 | missing | missing | missing | |
| 6474 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180010__271 | 0 | 0.0 | 30.5155 | 5 | [128, 511] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_180010__271.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6475 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180109__434 | 0 | 0.0 | 58.5187 | 5 | [128, 967] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_180109__434.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6476 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_000724__983 | 1 | 0.0 | 24.3558 | 1 | [128, 404] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_000724__983.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6477 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_232250__286 | 0 | 0.0 | 7.98963 | 0 | [1, 247] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_232250__286.json | 0.0 | missing | missing | missing | |
| 6478 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_232304__740 | 1 | 0.0 | 13.5668 | 1 | [1, 408] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_232304__740.json | 60.0 | missing | missing | missing | |
| 6479 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_175928__376 | 1 | 0.0 | 10.3057 | 5 | [169, 155] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_175928__376.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6480 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_175940__138 | 1 | 0.0 | 11.5181 | 1 | [169, 176] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_175940__138.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6481 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_000659__332 | 1 | 0.0 | 11.4472 | 1 | [169, 175] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_000659__332.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6482 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_232211__212 | 0 | 0.0 | 14.5891 | 0 | [1, 417] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_232211__212.json | 0.0 | missing | missing | missing | |
| 6483 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_232228__180 | 0 | 0.0 | 17.7035 | 0 | [1, 499] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_232228__180.json | 0.0 | missing | missing | missing | |
| 6484 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_175849__798 | 0 | 0.0 | 37.7977 | 0 | [316, 451] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_175849__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6485 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_175917__939 | 0 | 0.0 | 28.5244 | 0 | [316, 443] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_175917__939.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6486 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_000647__398 | 0 | 0.0 | 35.9159 | 0 | [316, 422] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_000647__398.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6487 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_232703__118 | 0 | 0.0 | 13.3089 | 0 | [1, 369] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_232703__118.json | 0.0 | missing | missing | missing | |
| 6488 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_232736__871 | 1 | 0.0 | 33.1367 | 1 | [1, 854] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_232736__871.json | 60.0 | missing | missing | missing | |
| 6489 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_180317__924 | 1 | 0.0 | 26.4295 | 1 | [436, 386] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_180317__924.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6490 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_180339__299 | 1 | 0.0 | 22.0969 | 1 | [436, 315] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_180339__299.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6491 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_000823__383 | 1 | 0.0 | 29.0012 | 1 | [436, 428] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_000823__383.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6492 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_232555__135 | 0 | 0.0 | 25.6916 | 0 | [1, 680] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_232555__135.json | 0.0 | missing | missing | missing | |
| 6493 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_232629__961 | 1 | 0.0 | 34.0013 | 1 | [1, 875] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_232629__961.json | 60.0 | missing | missing | missing | |
| 6494 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180218__946 | 1 | 0.0 | 22.5102 | 1 | [434, 322] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180218__946.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6495 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180250__967 | 1 | 0.0 | 31.5511 | 1 | [434, 470] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180250__967.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6496 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_000754__777 | 1 | 0.0 | 29.35 | 1 | [434, 434] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_000754__777.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6497 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231214_005239__283 | 0 | 0.0 | 18.7086 | 0 | [107, 534] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__AsIs__1SHOT__20231214_005239__283.json | 0.0 | missing | missing | missing | |
| 6498 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_172534__413 | 0 | 0.0 | 6.20525 | 0 | [123, 331] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_172534__413.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6499 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_172540__837 | 0 | 0.0 | 5.6391 | 0 | [123, 302] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_172540__837.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6500 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231214_005220__524 | 0 | 0.0 | 12.9722 | 1 | [124, 371] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_005220__524.json | 55.0 | missing | missing | missing | |
| 6501 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_172521__633 | 0 | 0.0 | 17.7103 | 0 | [126, 904] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_172521__633.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6502 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_172528__391 | 0 | 0.0 | 7.04211 | 0 | [126, 384] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_172528__391.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6503 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231226_235401__979 | 0 | 0.0 | 9.67895 | 0 | [126, 528] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_235401__979.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6504 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005207__329 | 1 | 0.0 | 17.9101 | 1 | [153, 498] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_005207__329.json | 60.0 | missing | missing | missing | |
| 6505 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_172454__899 | 0 | 0.0 | 7.02696 | 0 | [163, 359] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_172454__899.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6506 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_172503__390 | 0 | 0.0 | 8.7292 | 0 | [163, 446] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_172503__390.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6507 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_235351__638 | 0 | 0.0 | 8.56118 | 0 | [163, 456] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_235351__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6508 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_005149__241 | 0 | 0.0 | 24.9243 | 0 | [300, 618] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005149__241.json | 0.0 | missing | missing | missing | |
| 6509 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_172440__490 | 0 | 0.0 | 14.4275 | 0 | [278, 571] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172440__490.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6510 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_172447__970 | 0 | 0.0 | 7.23043 | 0 | [278, 356] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172447__970.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6511 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_235342__569 | 0 | 0.0 | 10.0353 | 0 | [278, 365] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235342__569.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6512 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_005329__722 | 0 | 0.0 | 23.7351 | 0 | [11, 626] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005329__722.json | 0.0 | missing | missing | missing | |
| 6513 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_172625__866 | 1 | 0.0 | 10.9192 | 1 | [413, 487] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172625__866.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6514 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_172638__552 | 0 | 0.0 | 12.92 | 0 | [413, 591] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172638__552.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6515 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_235422__684 | 1 | 0.0 | 12.8238 | 1 | [413, 592] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235422__684.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6516 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_005305__630 | 0 | 0.0 | 25.7998 | 0 | [424, 582] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_005305__630.json | 0.0 | missing | missing | missing | |
| 6517 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_172557__137 | 0 | 0.0 | 17.8889 | 0 | [411, 797] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_172557__137.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6518 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_172614__689 | 0 | 0.0 | 16.7461 | 0 | [411, 755] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_172614__689.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6519 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_235409__188 | 0 | 0.0 | 8.30775 | 0 | [411, 375] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_235409__188.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6520 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231214_004256__595 | 0 | 0.0 | 21.2526 | 0 | [107, 603] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__AsIs__1SHOT__20231214_004256__595.json | 0.0 | missing | missing | missing | |
| 6521 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_165832__746 | 0 | 0.0 | 11.4145 | 0 | [125, 361] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_165832__746.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6522 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_165842__733 | 0 | 0.0 | 9.9421 | 0 | [125, 313] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_165842__733.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6523 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | InJulia | 1SHOT | false | false | 5 | 20231214_004235__581 | 0 | 0.0 | 15.322 | 0 | [124, 438] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_004235__581.json | 0.0 | missing | missing | missing | |
| 6524 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_165806__662 | 1 | 0.0 | 16.2506 | 1 | [128, 516] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_165806__662.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6525 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_165820__520 | 0 | 0.0 | 13.6783 | 0 | [128, 433] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_165820__520.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6526 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_234439__145 | 1 | 0.0 | 12.605 | 1 | [128, 398] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_234439__145.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6527 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_004219__757 | 1 | 0.0 | 12.47 | 1 | [153, 345] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_004219__757.json | 60.0 | missing | missing | missing | |
| 6528 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_165734__479 | 0 | 0.0 | 10.4342 | 0 | [169, 318] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_165734__479.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6529 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_165749__342 | 1 | 0.0 | 15.0129 | 1 | [169, 464] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_165749__342.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6530 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234426__466 | 1 | 0.0 | 10.9511 | 1 | [169, 334] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_234426__466.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6531 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_004207__699 | 0 | 0.0 | 14.2614 | 0 | [300, 338] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004207__699.json | 0.0 | missing | missing | missing | |
| 6532 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_165705__187 | 0 | 0.0 | 20.4789 | 0 | [316, 442] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165705__187.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6533 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_165724__419 | 0 | 0.0 | 18.3347 | 0 | [316, 542] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165724__419.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6534 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_234415__188 | 0 | 0.0 | 22.9606 | 0 | [316, 522] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234415__188.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6535 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_004340__484 | 3 | 0.0 | 21.0981 | 5 | [11, 561] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004340__484.json | 90.0 | missing | missing | missing | |
| 6536 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_165925__835 | 0 | 0.0 | 11.342 | 0 | [436, 300] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165925__835.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6537 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_165953__830 | 0 | 0.0 | 28.1929 | 1 | [436, 807] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165953__830.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6538 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_234508__422 | 1 | 0.0 | 16.5019 | 1 | [436, 457] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234508__422.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6539 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_004318__627 | 0 | 0.0 | 22.4185 | 0 | [424, 497] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_004318__627.json | 0.0 | missing | missing | missing | |
| 6540 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_165905__595 | 0 | 0.0 | 22.6802 | 0 | [434, 645] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_165905__595.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6541 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_165913__771 | 0 | 0.0 | 8.44353 | 1 | [434, 208] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_165913__771.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6542 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_234452__747 | 1 | 0.0 | 12.5569 | 1 | [434, 336] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_234452__747.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6543 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231214_004450__810 | 0 | 0.0 | 21.0213 | 0 | [107, 596] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__AsIs__1SHOT__20231214_004450__810.json | 0.0 | missing | missing | missing | |
| 6544 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | AsIs | 1SHOT | true | true | 5 | 20231225_170656__687 | 1 | 0.0 | 54.7849 | 1 | [120, 405] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_170656__687.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6545 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | AsIs | 1SHOT | true | false | 5 | 20231225_170808__856 | 0 | 0.0 | 70.9031 | 0 | [120, 529] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_170808__856.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6546 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231214_004429__412 | 1 | 0.0 | 19.904 | 1 | [124, 564] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_004429__412.json | 60.0 | missing | missing | missing | |
| 6547 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_170458__552 | 0 | 0.0 | 79.645 | 0 | [123, 594] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_170458__552.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6548 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_170601__548 | 0 | 0.0 | 63.3382 | 0 | [123, 468] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_170601__548.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6549 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231226_234907__469 | 0 | 0.0 | 100.992 | 0 | [123, 748] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_234907__469.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6550 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_004409__679 | 1 | 0.0 | 9.86623 | 1 | [153, 269] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_004409__679.json | 60.0 | missing | missing | missing | |
| 6551 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_170243__649 | 1 | 0.0 | 41.6704 | 1 | [162, 296] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_170243__649.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6552 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_170338__439 | 0 | 0.0 | 53.6046 | 0 | [162, 387] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_170338__439.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6553 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234726__465 | 0 | 0.0 | 40.1088 | 0 | [162, 283] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_234726__465.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6554 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_004358__969 | 0 | 0.0 | 18.7159 | 0 | [300, 457] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004358__969.json | 25.0 | missing | missing | missing | |
| 6555 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_170100__818 | 1 | 0.0 | 66.6179 | 1 | [313, 280] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_170100__818.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6556 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_170202__346 | 0 | 0.0 | 61.0314 | 0 | [313, 415] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_170202__346.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6557 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_234646__682 | 0 | 0.0 | 97.3872 | 0 | [313, 516] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234646__682.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6558 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_004536__187 | 1 | 0.0 | 19.6626 | 1 | [11, 525] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004536__187.json | 60.0 | missing | missing | missing | |
| 6559 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_171229__908 | 0 | 0.0 | 85.7912 | 0 | [436, 568] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_171229__908.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6560 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_171238__680 | 0 | 0.0 | 9.69164 | 0 | [436, 4] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_171238__680.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6561 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_235029__204 | 0 | 0.0 | 16.7411 | 0 | [436, 58] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235029__204.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6562 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_004516__993 | 0 | 0.0 | 26.3958 | 0 | [424, 597] | 0.10.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_004516__993.json | 0.0 | missing | missing | missing | |
| 6563 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_170951__786 | 0 | 0.0 | 102.185 | 0 | [434, 684] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_170951__786.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6564 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_171103__565 | 0 | 0.0 | 71.4798 | 0 | [434, 465] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_171103__565.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6565 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_235012__991 | 0 | 0.0 | 63.985 | 0 | [434, 410] | 0.10.0-DEV | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_235012__991.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6566 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231214_010951__116 | 0 | 0.0 | 37.7845 | 0 | [72, 297] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231214_010951__116.json | 0.0 | missing | missing | missing | |
| 6567 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_142526__903 | 0 | 0.0 | 9.12034 | 0 | [94, 160] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_142526__903.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6568 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | AsIs | 1SHOT | true | false | 5 | 20231225_142536__339 | 0 | 0.0 | 9.21741 | 0 | [94, 162] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_142536__339.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6569 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231214_010913__674 | 0 | 0.0 | 67.7251 | 0 | [89, 548] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_010913__674.json | 50.0 | missing | missing | missing | |
| 6570 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_142511__732 | 0 | 0.0 | 23.315 | 0 | [97, 423] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_142511__732.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6571 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231225_142517__950 | 0 | 0.0 | 6.3899 | 0 | [97, 104] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_142517__950.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6572 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_002344__284 | 1 | 0.0 | 5.80949 | 1 | [97, 92] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_002344__284.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6573 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_010805__240 | 0 | 0.0 | 47.798 | 0 | [118, 377] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_010805__240.json | 0.0 | missing | missing | missing | |
| 6574 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_142442__217 | 0 | 0.0 | 14.8807 | 0 | [135, 260] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_142442__217.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6575 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_142447__656 | 0 | 0.0 | 5.02571 | 0 | [135, 72] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_142447__656.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6576 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_002339__118 | 1 | 0.0 | 6.11357 | 1 | [135, 93] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_002339__118.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6577 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_010717__587 | 0 | 0.0 | 65.1538 | 0 | [211, 509] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_010717__587.json | 50.0 | missing | missing | missing | |
| 6578 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_142413__167 | 0 | 0.0 | 25.2016 | 1 | [229, 253] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142413__167.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6579 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_142427__660 | 1 | 0.0 | 14.5692 | 1 | [229, 236] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142427__660.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6580 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_002332__865 | 0 | 0.0 | 24.8174 | 1 | [229, 252] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002332__865.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6581 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_011155__641 | 0 | 0.0 | 60.4912 | 0 | [11, 467] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_011155__641.json | 50.0 | missing | missing | missing | |
| 6582 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_142631__769 | 1 | 0.0 | 16.5486 | 1 | [400, 242] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142631__769.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6583 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_142649__487 | 1 | 0.0 | 18.0384 | 1 | [400, 269] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142649__487.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6584 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_002426__776 | 0 | 0.0 | 16.97 | 0 | [400, 245] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002426__776.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6585 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_011055__712 | 0 | 0.0 | 64.1534 | 0 | [389, 404] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_011055__712.json | 0.0 | missing | missing | missing | |
| 6586 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_142558__838 | 1 | 0.0 | 22.2302 | 2 | [397, 344] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_142558__838.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 6587 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_142614__308 | 0 | 0.0 | 16.5594 | 0 | [397, 243] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_142614__308.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6588 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_002408__254 | 0 | 0.0 | 23.1266 | 0 | [397, 358] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_002408__254.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6589 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_011614__365 | 0 | 0.0 | 1.8367 | 0 | [0, 138] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011614__365.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6590 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_011617__411 | 0 | 0.0 | 2.34945 | 0 | [0, 179] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011617__411.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6591 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_011620__705 | 1 | 0.0 | 3.02811 | 1 | [0, 230] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011620__705.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6592 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_011625__498 | 0 | 0.0 | 4.84455 | 0 | [0, 367] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011625__498.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6593 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_011629__506 | 1 | 0.0 | 4.18392 | 1 | [0, 318] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011629__506.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6594 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_011553__474 | 0 | 0.0 | 3.72863 | 0 | [0, 270] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011553__474.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6595 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_011556__603 | 0 | 0.0 | 3.47975 | 0 | [0, 252] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011556__603.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6596 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_011558__931 | 1 | 0.0 | 1.72767 | 1 | [0, 126] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011558__931.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6597 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_011600__339 | 0 | 0.0 | 2.26955 | 0 | [0, 169] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011600__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6598 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_011601__978 | 0 | 0.0 | 1.02654 | 0 | [0, 78] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011601__978.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6599 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_011530__275 | 1 | 0.0 | 3.4693 | 1 | [0, 249] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011530__275.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6600 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_011531__397 | 1 | 0.0 | 1.74548 | 1 | [0, 126] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011531__397.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6601 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_011533__275 | 0 | 0.0 | 1.44728 | 0 | [0, 104] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011533__275.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6602 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_011540__581 | 1 | 0.0 | 6.77368 | 2 | [0, 481] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011540__581.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 6603 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_011542__426 | 0 | 0.0 | 2.22106 | 0 | [0, 160] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011542__426.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6604 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_011738__653 | 0 | 0.0 | 4.48623 | 0 | [0, 331] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011738__653.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6605 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_011741__784 | 0 | 0.0 | 2.68399 | 0 | [0, 193] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011741__784.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6606 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_011746__198 | 0 | 0.0 | 4.7657 | 0 | [0, 343] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011746__198.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6607 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_011750__382 | 0 | 0.0 | 4.06256 | 0 | [0, 293] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011750__382.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6608 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_011752__214 | 0 | 0.0 | 1.83816 | 0 | [0, 133] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011752__214.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6609 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_011655__218 | 1 | 0.0 | 3.73657 | 1 | [0, 276] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011655__218.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6610 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_011658__845 | 0 | 0.0 | 2.97591 | 1 | [0, 215] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011658__845.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6611 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_011703__661 | 0 | 0.0 | 5.16015 | 0 | [0, 377] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011703__661.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6612 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_011705__165 | 0 | 0.0 | 1.77523 | 0 | [0, 131] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011705__165.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6613 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_011707__942 | 0 | 0.0 | 1.51673 | 0 | [0, 112] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011707__942.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6614 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231214_072123__338 | 0 | 0.0 | 22042.7 | 0 | [72, 442] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__AsIs__1SHOT__20231214_072123__338.json | 0.0 | missing | missing | missing | |
| 6615 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | AsIs | 1SHOT | true | true | 5 | 20231225_142820__930 | 0 | 0.0 | 7.638 | 0 | [68, 133] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_142820__930.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6616 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | AsIs | 1SHOT | true | true | 5 | 20231225_142831__220 | 0 | 0.0 | 11.1361 | 0 | [68, 200] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_142831__220.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6617 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | InJulia | 1SHOT | true | true | 5 | 20231214_011400__819 | 0 | 0.0 | 53.6548 | 0 | [89, 451] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_011400__819.json | 50.0 | missing | missing | missing | |
| 6618 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_142804__566 | 0 | 0.0 | 1.56477 | 0 | [71, 15] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_142804__566.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6619 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | InJulia | 1SHOT | true | true | 5 | 20231225_142813__391 | 0 | 0.0 | 8.78628 | 0 | [71, 155] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_142813__391.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6620 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_011307__959 | 0 | 0.0 | 20.157 | 0 | [118, 167] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_011307__959.json | 0.0 | missing | missing | missing | |
| 6621 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_142740__622 | 0 | 0.0 | 27.433 | 0 | [72, 504] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_142740__622.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6622 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_142802__158 | 0 | 0.0 | 21.6767 | 0 | [72, 399] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_142802__158.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6623 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_011246__430 | 0 | 0.0 | 51.0405 | 0 | [211, 406] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_011246__430.json | 50.0 | missing | missing | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |
| 6692 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_005824__145 | 0 | 0.0 | 13.7866 | 0 | [0, 257] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_005824__145.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6693 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_005851__605 | 0 | 0.0 | 27.3494 | 0 | [0, 507] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_005851__605.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6694 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_005855__532 | 0 | 0.0 | 4.06874 | 0 | [0, 76] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_005855__532.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6695 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_005909__647 | 0 | 0.0 | 14.0191 | 0 | [0, 261] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_005909__647.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6696 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_005921__923 | 0 | 0.0 | 12.2805 | 0 | [0, 229] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_005921__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6697 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_010810__731 | 0 | 0.0 | 5.35063 | 0 | [0, 99] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_010810__731.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6698 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_010820__683 | 0 | 0.0 | 10.2574 | 0 | [0, 190] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_010820__683.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6699 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_010835__208 | 0 | 0.0 | 14.1546 | 0 | [0, 260] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_010835__208.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6700 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_010845__757 | 0 | 0.0 | 10.4828 | 0 | [0, 192] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_010845__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6701 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_010900__483 | 0 | 0.0 | 15.321 | 0 | [0, 281] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_010900__483.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6702 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_010530__478 | 1 | 0.0 | 19.1506 | 1 | [0, 354] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_010530__478.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6703 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_010604__307 | 0 | 0.0 | 33.1994 | 0 | [0, 611] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_010604__307.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6704 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_010635__187 | 0 | 0.0 | 31.5386 | 0 | [0, 578] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_010635__187.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6705 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_010638__911 | 0 | 0.0 | 3.20587 | 0 | [0, 59] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_010638__911.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6706 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_010653__536 | 0 | 0.0 | 14.8614 | 0 | [0, 272] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_010653__536.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6707 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_011834__784 | 1 | 0.0 | 1.37658 | 2 | [0, 167] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011834__784.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 6708 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_011837__939 | 1 | 0.0 | 2.85362 | 1 | [0, 345] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011837__939.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6709 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_011839__701 | 0 | 0.0 | 2.15504 | 0 | [0, 261] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011839__701.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6710 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_011841__279 | 0 | 0.0 | 2.00667 | 0 | [0, 243] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011841__279.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6711 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_011844__274 | 0 | 0.0 | 2.31874 | 0 | [0, 281] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_011844__274.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6712 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_011817__845 | 0 | 0.0 | 0.595044 | 0 | [0, 72] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011817__845.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6713 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_011819__957 | 0 | 0.0 | 1.92758 | 0 | [0, 223] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011819__957.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6714 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_011820__362 | 0 | 0.0 | 1.25289 | 0 | [0, 147] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011820__362.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6715 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_011822__475 | 1 | 0.0 | 1.68897 | 2 | [0, 198] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011822__475.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 6716 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_011823__966 | 0 | 0.0 | 1.16058 | 0 | [0, 135] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_011823__966.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6717 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_011802__747 | 0 | 0.0 | 0.711858 | 0 | [0, 86] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011802__747.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6718 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_011803__133 | 0 | 0.0 | 1.35026 | 0 | [0, 163] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011803__133.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6719 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_011805__139 | 0 | 0.0 | 1.21216 | 0 | [0, 144] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011805__139.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6720 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_011805__406 | 0 | 0.0 | 0.702993 | 0 | [0, 85] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011805__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6721 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_011806__803 | 0 | 0.0 | 0.70198 | 0 | [0, 85] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_011806__803.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6722 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_011931__374 | 0 | 0.0 | 2.23962 | 0 | [0, 264] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011931__374.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6723 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_011935__658 | 0 | 0.0 | 2.73718 | 0 | [0, 322] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011935__658.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6724 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_011936__663 | 0 | 0.0 | 1.55941 | 0 | [0, 185] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011936__663.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6725 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_011938__385 | 0 | 0.0 | 1.95064 | 0 | [0, 231] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011938__385.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6726 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_011939__799 | 0 | 0.0 | 1.16855 | 0 | [0, 139] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_011939__799.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6727 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_011858__426 | 1 | 0.0 | 1.50592 | 1 | [0, 179] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011858__426.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6728 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_011900__669 | 0 | 0.0 | 1.83216 | 0 | [0, 217] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011900__669.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6729 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_011902__902 | 0 | 0.0 | 2.18947 | 0 | [0, 259] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011902__902.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6730 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_011905__362 | 0 | 0.0 | 2.77175 | 0 | [0, 325] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011905__362.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6731 | NVIDIA-RTX-4090-4x | extract_julia_code | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_011909__100 | 0 | 0.0 | 3.43682 | 1 | [0, 405] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_011909__100.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6732 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_234725__653 | 0 | 0.0 | 17.6313 | 0 | [72, 518] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_234725__653.json | 0.0 | missing | missing | missing | |
| 6733 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_234738__521 | 0 | 0.0 | 12.984 | 0 | [1, 401] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_234738__521.json | 0.0 | missing | missing | missing | |
| 6734 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_234752__500 | 0 | 0.0 | 13.9434 | 0 | [1, 429] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231219_234752__500.json | 0.0 | missing | missing | missing | |
| 6735 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_145012__103 | 0 | 0.0 | 38.9957 | 0 | [87, 233] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_145012__103.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6736 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_145056__195 | 0 | 0.0 | 43.9804 | 0 | [87, 264] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_145056__195.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6737 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_234655__125 | 0 | 0.0 | 10.6874 | 0 | [1, 334] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_234655__125.json | 0.0 | missing | missing | missing | |
| 6738 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_234708__156 | 0 | 0.0 | 12.0575 | 0 | [1, 374] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_234708__156.json | 25.0 | missing | missing | missing | |
| 6739 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_144900__231 | 1 | 0.0 | 45.4329 | 1 | [90, 273] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_144900__231.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6740 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_144932__862 | 1 | 0.0 | 30.7255 | 1 | [90, 177] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_144932__862.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6741 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_003537__647 | 0 | 0.0 | 43.382 | 1 | [90, 257] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_003537__647.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6742 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_234623__114 | 0 | 0.0 | 2.24022 | 0 | [1, 72] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_234623__114.json | 25.0 | missing | missing | missing | |
| 6743 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_234632__484 | 0 | 0.0 | 8.7994 | 0 | [1, 274] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_234632__484.json | 50.0 | missing | missing | missing | |
| 6744 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_144724__475 | 1 | 0.0 | 71.5143 | 1 | [131, 424] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_144724__475.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6745 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_144815__751 | 0 | 0.0 | 49.8838 | 0 | [131, 291] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_144815__751.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6746 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_003453__641 | 0 | 0.0 | 140.334 | 0 | [131, 810] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_003453__641.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6747 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_234607__696 | 0 | 0.0 | 4.35872 | 0 | [1, 134] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234607__696.json | 0.0 | missing | missing | missing | |
| 6748 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_234613__536 | 0 | 0.0 | 5.98466 | 0 | [1, 183] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234613__536.json | 0.0 | missing | missing | missing | |
| 6749 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_144536__795 | 0 | 0.0 | 57.0753 | 2 | [224, 161] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_144536__795.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 6750 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_144612__855 | 1 | 0.0 | 36.0845 | 1 | [224, 193] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_144612__855.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6751 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_003233__648 | 0 | 0.0 | 44.8647 | 0 | [224, 97] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_003233__648.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6752 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_234852__274 | 0 | 0.0 | 20.4252 | 0 | [1, 557] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_234852__274.json | 0.0 | missing | missing | missing | |
| 6753 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_234914__491 | 0 | 0.0 | 22.0867 | 0 | [1, 598] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_234914__491.json | 0.0 | missing | missing | missing | |
| 6754 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_145305__691 | 1 | 0.0 | 74.0036 | 2 | [419, 385] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145305__691.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 6755 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_145349__154 | 0 | 0.0 | 43.5632 | 1 | [419, 202] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145349__154.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6756 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_003731__364 | 1 | 0.0 | 65.6117 | 2 | [419, 329] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_003731__364.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 6757 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_234817__538 | 0 | 0.0 | 1.22261 | 0 | [1, 36] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_234817__538.json | 0.0 | missing | missing | missing | |
| 6758 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_234829__105 | 0 | 0.0 | 11.9802 | 0 | [1, 338] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_234829__105.json | 50.0 | missing | missing | missing | |
| 6759 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_145129__673 | 1 | 0.0 | 33.2935 | 2 | [417, 140] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145129__673.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 6760 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_145150__892 | 1 | 0.0 | 20.8068 | 1 | [417, 63] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145150__892.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6761 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_003625__313 | 1 | 0.0 | 47.3912 | 2 | [417, 225] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_003625__313.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 6762 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_004318__971 | 0 | 0.0 | 12.9747 | 0 | [89, 494] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_004318__971.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6763 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_114514__282 | 0 | 0.0 | 10.1774 | 0 | [89, 390] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_114514__282.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6764 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_114528__209 | 0 | 0.0 | 14.3481 | 0 | [89, 544] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_114528__209.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6765 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_114541__757 | 0 | 0.0 | 12.6967 | 0 | [89, 484] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_114541__757.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6766 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_004305__873 | 0 | 0.0 | 3.26013 | 0 | [126, 117] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_004305__873.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6767 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_114450__993 | 0 | 0.0 | 5.01629 | 0 | [126, 186] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_114450__993.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6768 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_114456__508 | 0 | 0.0 | 5.93223 | 0 | [126, 222] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_114456__508.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6769 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_114503__288 | 0 | 0.0 | 7.12582 | 0 | [126, 268] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_114503__288.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6770 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_004302__549 | 0 | 0.0 | 5.46064 | 0 | [217, 58] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004302__549.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6771 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_114437__146 | 0 | 0.0 | 7.838 | 0 | [217, 146] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114437__146.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6772 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_114442__482 | 0 | 0.0 | 5.42468 | 0 | [217, 187] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114442__482.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6773 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_114445__882 | 0 | 0.0 | 3.19949 | 0 | [217, 101] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114445__882.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6774 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_004331__116 | 0 | 0.0 | 7.03182 | 0 | [378, 219] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004331__116.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6775 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_114621__290 | 0 | 0.0 | 8.397 | 0 | [378, 269] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114621__290.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6776 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_114628__936 | 0 | 0.0 | 6.57376 | 0 | [378, 202] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114628__936.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6777 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_114638__374 | 0 | 0.0 | 9.70475 | 0 | [378, 316] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114638__374.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6778 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_004324__750 | 0 | 0.0 | 5.65424 | 0 | [375, 168] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_004324__750.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6779 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_114551__418 | 0 | 0.0 | 9.9777 | 0 | [375, 326] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_114551__418.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6780 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_114600__157 | 0 | 0.0 | 9.19691 | 0 | [375, 298] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_114600__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6781 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_114613__598 | 0 | 0.0 | 12.9481 | 0 | [375, 432] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_114613__598.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6782 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_110604__501 | 0 | 0.0 | 2.06435 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110604__501.json | 25.0 | missing | missing | missing | |
| 6783 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_110607__503 | 0 | 0.0 | 2.95979 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110607__503.json | 0.0 | missing | missing | missing | |
| 6784 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_110609__908 | 0 | 0.0 | 2.53188 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110609__908.json | 50.0 | missing | missing | missing | |
| 6785 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_110612__898 | 0 | 0.0 | 2.80807 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110612__898.json | 25.0 | missing | missing | missing | |
| 6786 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_110616__453 | 0 | 0.0 | 4.18386 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110616__453.json | 0.0 | missing | missing | missing | |
| 6787 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_110531__861 | 1 | 0.0 | 1.82251 | 2 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110531__861.json | 67.5 | missing | missing | missing | |
| 6788 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_110534__925 | 0 | 0.0 | 2.7088 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110534__925.json | 50.0 | missing | missing | missing | |
| 6789 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_110536__621 | 0 | 0.0 | 2.12944 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110536__621.json | 50.0 | missing | missing | missing | |
| 6790 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_110543__643 | 0 | 0.0 | 6.28392 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110543__643.json | 0.0 | missing | missing | missing | |
| 6791 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_113827__505 | 0 | 0.0 | 2.24605 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_113827__505.json | 0.0 | missing | missing | missing | |
| 6792 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_110500__893 | 0 | 0.0 | 5.44396 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110500__893.json | 50.0 | missing | missing | missing | |
| 6793 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_110503__685 | 0 | 0.0 | 3.14445 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110503__685.json | 50.0 | missing | missing | missing | |
| 6794 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_110506__642 | 0 | 0.0 | 2.4813 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110506__642.json | 50.0 | missing | missing | missing | |
| 6795 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_110513__830 | 0 | 0.0 | 7.01422 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110513__830.json | 0.0 | missing | missing | missing | |
| 6796 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_110515__685 | 0 | 0.0 | 1.99967 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110515__685.json | 50.0 | missing | missing | missing | |
| 6797 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_110714__944 | 0 | 0.0 | 7.26757 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110714__944.json | 50.0 | missing | missing | missing | |
| 6798 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_110719__566 | 0 | 0.0 | 4.78472 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110719__566.json | 50.0 | missing | missing | missing | |
| 6799 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_110721__439 | 0 | 0.0 | 1.81411 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110721__439.json | 50.0 | missing | missing | missing | |
| 6800 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_110726__514 | 0 | 0.0 | 4.82167 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110726__514.json | 50.0 | missing | missing | missing | |
| 6801 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_110728__359 | 0 | 0.0 | 2.05635 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_110728__359.json | 50.0 | missing | missing | missing | |
| 6802 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_110639__410 | 0 | 0.0 | 2.05078 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110639__410.json | 50.0 | missing | missing | missing | |
| 6803 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_110643__421 | 1 | 0.0 | 4.76285 | 2 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110643__421.json | 67.5 | missing | missing | missing | |
| 6804 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_110647__702 | 0 | 0.0 | 3.1949 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110647__702.json | 25.0 | missing | missing | missing | |
| 6805 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_110649__373 | 0 | 0.0 | 2.08529 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110649__373.json | 50.0 | missing | missing | missing | |
| 6806 | Apple-MacBook-Pro-M1 | extract_julia_code | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_110651__209 | 0 | 0.0 | 2.28468 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110651__209.json | 50.0 | missing | missing | missing | |
| 6807 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_231458__475 | 0 | 0.0 | 9.68119 | 0 | [0, 149] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_231458__475.json | 0.0 | missing | missing | missing | |
| 6808 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_231507__776 | 0 | 0.0 | 8.9875 | 0 | [0, 134] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_231507__776.json | 25.0 | missing | missing | missing | |
| 6809 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_231513__459 | 0 | 0.0 | 5.93247 | 0 | [0, 93] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_231513__459.json | 25.0 | missing | missing | missing | |
| 6810 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_231522__926 | 0 | 0.0 | 9.77333 | 0 | [0, 149] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_231522__926.json | 0.0 | missing | missing | missing | |
| 6811 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_231532__265 | 0 | 0.0 | 9.08752 | 0 | [0, 139] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_231532__265.json | 50.0 | missing | missing | missing | |
| 6812 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_231344__766 | 0 | 0.0 | 2.34516 | 0 | [0, 37] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_231344__766.json | 50.0 | missing | missing | missing | |
| 6813 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_231346__148 | 0 | 0.0 | 2.23712 | 0 | [0, 35] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_231346__148.json | 0.0 | missing | missing | missing | |
| 6814 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_231350__234 | 0 | 0.0 | 3.75485 | 2 | [0, 58] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_231350__234.json | 62.5 | missing | missing | missing | |
| 6815 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_231352__562 | 0 | 0.0 | 2.48822 | 0 | [0, 38] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_231352__562.json | 50.0 | missing | missing | missing | |
| 6816 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_231406__947 | 0 | 0.0 | 13.3583 | 2 | [0, 203] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_231406__947.json | 62.5 | missing | missing | missing | |
| 6817 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_231237__619 | 0 | 0.0 | 10.881 | 0 | [0, 164] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_231237__619.json | 0.0 | missing | missing | missing | |
| 6818 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240223_231247__335 | 1 | 0.0 | 10.5896 | 2 | [0, 162] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_231247__335.json | 67.5 | missing | missing | missing | |
| 6819 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_231256__280 | 0 | 0.0 | 8.61857 | 0 | [0, 131] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_231256__280.json | 0.0 | missing | missing | missing | |
| 6820 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240223_231307__135 | 0 | 0.0 | 11.186 | 0 | [0, 168] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_231307__135.json | 50.0 | missing | missing | missing | |
| 6821 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_231327__173 | 0 | 0.0 | 19.6404 | 0 | [0, 302] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_231327__173.json | 0.0 | missing | missing | missing | |
| 6822 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_231941__984 | 0 | 0.0 | 17.7067 | 0 | [0, 268] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_231941__984.json | 50.0 | missing | missing | missing | |
| 6823 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_232000__955 | 0 | 0.0 | 18.5352 | 0 | [0, 280] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_232000__955.json | 50.0 | missing | missing | missing | |
| 6824 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_232018__333 | 0 | 0.0 | 18.0586 | 0 | [0, 274] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_232018__333.json | 50.0 | missing | missing | missing | |
| 6825 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_232038__393 | 0 | 0.0 | 19.5419 | 0 | [0, 295] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_232038__393.json | 50.0 | missing | missing | missing | |
| 6826 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_232054__120 | 0 | 0.0 | 16.333 | 0 | [0, 247] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_232054__120.json | 50.0 | missing | missing | missing | |
| 6827 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_231658__966 | 0 | 0.0 | 16.2722 | 0 | [0, 247] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_231658__966.json | 50.0 | missing | missing | missing | |
| 6828 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_231715__573 | 0 | 0.0 | 16.7888 | 0 | [0, 255] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_231715__573.json | 0.0 | missing | missing | missing | |
| 6829 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_231727__474 | 0 | 0.0 | 11.8616 | 0 | [0, 179] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_231727__474.json | 50.0 | missing | missing | missing | |
| 6830 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_231744__979 | 0 | 0.0 | 17.5271 | 0 | [0, 265] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_231744__979.json | 0.0 | missing | missing | missing | |
| 6831 | Apple-MacBook-Pro-M1 | extract_julia_code | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_231803__717 | 0 | 0.0 | 19.0397 | 0 | [0, 288] | 0.13.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_231803__717.json | 0.0 | missing | missing | missing | |
| 6832 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231213_201931__152 | 0 | 0.000483 | 6.45669 | 0 | [78, 296] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_201931__152.json | 50.0 | missing | missing | missing | |
| 6833 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231225_192523__781 | 0 | 0.000231 | 2.18506 | 0 | [78, 128] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_192523__781.json | 50.0 | missing | missing | missing | |
| 6834 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231225_192524__145 | 0 | 0.000123 | 1.24863 | 0 | [78, 56] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_192524__145.json | 50.0 | missing | missing | missing | |
| 6835 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo--optim | AsIs | 1SHOT | true | true | 5 | 20231215_194217__960 | 0 | 0.0 | 4.11375 | 0 | [78, 177] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_194217__960.json | 50.0 | 0.5 | missing | 0.5 |
| 6836 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_201925__778 | 1 | 0.000279 | 3.52934 | 1 | [81, 159] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_201925__778.json | 61.25 | missing | missing | missing | |
| 6837 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_192518__598 | 0 | 0.000237 | 2.70846 | 0 | [81, 131] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_192518__598.json | 50.0 | missing | missing | missing | |
| 6838 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_192520__488 | 1 | 0.000261 | 2.50785 | 1 | [81, 147] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_192520__488.json | 61.25 | missing | missing | missing | |
| 6839 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_200043__186 | 1 | 0.000444 | 5.03352 | 2 | [81, 269] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_200043__186.json | 67.5 | missing | missing | missing | |
| 6840 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | InJulia | 1SHOT | false | false | 5 | 20231227_200048__289 | 0 | 0.0004245 | 4.63335 | 0 | [81, 256] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_200048__289.json | 0.0 | missing | missing | missing | |
| 6841 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_194213__995 | 0 | 0.0 | 2.7152 | 0 | [81, 131] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_194213__995.json | 50.0 | 0.5 | missing | 0.5 |
| 6842 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_201921__632 | 1 | 0.0001615 | 1.8594 | 1 | [116, 69] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_201921__632.json | 61.25 | missing | missing | missing | |
| 6843 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192513__823 | 0 | 0.0001585 | 1.53213 | 0 | [116, 67] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_192513__823.json | 50.0 | missing | missing | missing | |
| 6844 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192515__776 | 0 | 0.000193 | 1.89206 | 2 | [116, 90] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_192515__776.json | 62.5 | missing | missing | missing | |
| 6845 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200036__739 | 0 | 0.000145 | 1.76092 | 0 | [116, 58] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_200036__739.json | 50.0 | missing | missing | missing | |
| 6846 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200038__957 | 1 | 0.0001705 | 1.68238 | 2 | [116, 75] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_200038__957.json | 67.5 | missing | missing | missing | |
| 6847 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_194210__454 | 0 | 0.0 | 1.57719 | 0 | [116, 52] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_194210__454.json | 50.0 | 0.5 | missing | 0.5 |
| 6848 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_201919__471 | 0 | 0.000227 | 2.6679 | 1 | [190, 88] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_201919__471.json | 56.25 | missing | missing | missing | |
| 6849 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_192510__311 | 0 | 0.000203 | 1.71567 | 0 | [190, 72] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192510__311.json | 50.0 | missing | missing | missing | |
| 6850 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_192511__738 | 0 | 0.000167 | 1.32241 | 0 | [190, 48] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192511__738.json | 50.0 | missing | missing | missing | |
| 6851 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200032__815 | 0 | 0.000281 | 2.39296 | 0 | [190, 124] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200032__815.json | 50.0 | missing | missing | missing | |
| 6852 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200034__915 | 1 | 0.000182 | 1.39554 | 1 | [190, 58] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200034__915.json | 61.25 | missing | missing | missing | |
| 6853 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_194208__882 | 0 | 0.0 | 1.47376 | 0 | [190, 50] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_194208__882.json | 50.0 | 0.5 | missing | 0.5 |
| 6854 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_201944__727 | 0 | 0.000504 | 5.34757 | 0 | [339, 223] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_201944__727.json | 50.0 | missing | missing | missing | |
| 6855 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_192531__454 | 0 | 0.0004065 | 2.55756 | 0 | [339, 158] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192531__454.json | 0.0 | missing | missing | missing | |
| 6856 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_192533__599 | 0 | 0.0003825 | 2.14311 | 0 | [339, 142] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192533__599.json | 0.0 | missing | missing | missing | |
| 6857 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_200102__219 | 0 | 0.0004005 | 2.88795 | 0 | [339, 154] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200102__219.json | 0.0 | missing | missing | missing | |
| 6858 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_200108__506 | 0 | 0.0006645 | 6.2758 | 0 | [339, 330] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200108__506.json | 50.0 | missing | missing | missing | |
| 6859 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_194240__663 | 1 | 0.0 | 18.1039 | 2 | [339, 781] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_194240__663.json | 67.5 | 0.5 | missing | 0.5 |
| 6860 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_201939__995 | 0 | 0.0006865 | 7.18878 | 0 | [338, 345] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_201939__995.json | 0.0 | missing | missing | missing | |
| 6861 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_192526__380 | 0 | 0.000289 | 1.61883 | 0 | [338, 80] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_192526__380.json | 0.0 | missing | missing | missing | |
| 6862 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192529__444 | 0 | 0.000433 | 2.9683 | 0 | [338, 176] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_192529__444.json | 50.0 | missing | missing | missing | |
| 6863 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_200052__894 | 0 | 0.000511 | 4.35215 | 0 | [338, 228] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_200052__894.json | 25.0 | missing | missing | missing | |
| 6864 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200059__787 | 0 | 0.0006985 | 6.93057 | 0 | [338, 353] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_200059__787.json | 50.0 | missing | missing | missing | |
| 6865 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_194221__943 | 0 | 0.0 | 4.45707 | 0 | [338, 199] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_194221__943.json | 50.0 | 0.5 | missing | 0.5 |
| 6866 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200439__502 | 0 | 0.0001725 | 0.870995 | 0 | [81, 88] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200439__502.json | 50.0 | missing | missing | missing | |
| 6867 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200441__234 | 1 | 0.00027 | 1.23316 | 1 | [81, 153] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200441__234.json | 61.25 | missing | missing | missing | |
| 6868 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200441__642 | 0 | 0.000153 | 0.951789 | 0 | [81, 75] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200441__642.json | 50.0 | missing | missing | missing | |
| 6869 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200442__571 | 0 | 0.0001155 | 0.563792 | 0 | [81, 50] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200442__571.json | 50.0 | missing | missing | missing | |
| 6870 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200443__858 | 1 | 0.000246 | 1.30826 | 1 | [81, 137] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200443__858.json | 61.25 | missing | missing | missing | |
| 6871 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200436__531 | 1 | 0.0001345 | 0.634847 | 1 | [116, 51] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200436__531.json | 61.25 | missing | missing | missing | |
| 6872 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200436__614 | 0 | 0.0001285 | 0.770716 | 0 | [116, 47] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200436__614.json | 50.0 | missing | missing | missing | |
| 6873 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_200437__249 | 0 | 0.0001345 | 0.765366 | 0 | [116, 51] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200437__249.json | 0.0 | missing | missing | missing | |
| 6874 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200438__585 | 0 | 0.0001315 | 0.54624 | 0 | [116, 49] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200438__585.json | 50.0 | missing | missing | missing | |
| 6875 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200438__601 | 0 | 0.000121 | 0.556087 | 0 | [116, 42] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200438__601.json | 50.0 | missing | missing | missing | |
| 6876 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200432__603 | 1 | 0.000329 | 1.41657 | 1 | [190, 156] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200432__603.json | 61.25 | missing | missing | missing | |
| 6877 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200432__767 | 0 | 0.0001625 | 0.623037 | 0 | [190, 45] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200432__767.json | 50.0 | missing | missing | missing | |
| 6878 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200433__349 | 1 | 0.0002255 | 0.986615 | 1 | [190, 87] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200433__349.json | 61.25 | missing | missing | missing | |
| 6879 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200434__800 | 0 | 0.0002045 | 0.904554 | 0 | [190, 73] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200434__800.json | 50.0 | missing | missing | missing | |
| 6880 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200435__784 | 0 | 0.0002345 | 0.803338 | 0 | [190, 93] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200435__784.json | 50.0 | missing | missing | missing | |
| 6881 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200450__580 | 0 | 0.000381 | 1.09854 | 0 | [339, 141] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200450__580.json | 0.0 | missing | missing | missing | |
| 6882 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200451__168 | 0 | 0.000399 | 1.45308 | 0 | [339, 153] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200451__168.json | 50.0 | missing | missing | missing | |
| 6883 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200453__666 | 0 | 0.0004455 | 1.58579 | 0 | [339, 184] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200453__666.json | 0.0 | missing | missing | missing | |
| 6884 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200454__266 | 0 | 0.0003135 | 0.925194 | 0 | [339, 96] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200454__266.json | 50.0 | missing | missing | missing | |
| 6885 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200456__295 | 0 | 0.0004365 | 1.54262 | 0 | [339, 178] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200456__295.json | 0.0 | missing | missing | missing | |
| 6886 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200444__458 | 0 | 0.000274 | 0.670019 | 0 | [338, 70] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200444__458.json | 50.0 | missing | missing | missing | |
| 6887 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200445__736 | 0 | 0.0002695 | 0.857556 | 0 | [338, 67] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200445__736.json | 50.0 | missing | missing | missing | |
| 6888 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200446__428 | 1 | 0.000385 | 1.11562 | 2 | [338, 144] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200446__428.json | 67.5 | missing | missing | missing | |
| 6889 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200447__461 | 1 | 0.000328 | 1.04407 | 2 | [338, 106] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200447__461.json | 67.5 | missing | missing | missing | |
| 6890 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_200449__196 | 0 | 0.000478 | 1.60808 | 0 | [338, 206] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200449__196.json | 0.0 | missing | missing | missing | |
| 6891 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231213_201952__428 | 0 | 0.000304 | 3.01098 | 0 | [78, 113] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_201952__428.json | 50.0 | missing | missing | missing | |
| 6892 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231225_192547__186 | 0 | 0.0003 | 1.88314 | 0 | [78, 111] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_192547__186.json | 50.0 | missing | missing | missing | |
| 6893 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231225_192549__596 | 1 | 0.00029 | 2.04514 | 1 | [78, 106] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_192549__596.json | 61.25 | missing | missing | missing | |
| 6894 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | true | true | 5 | 20231215_194248__982 | 0 | 0.0 | 2.3424 | 0 | [78, 73] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_194248__982.json | 50.0 | 0.9 | missing | 0.1 |
| 6895 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231213_201949__254 | 1 | 0.000291 | 1.74816 | 1 | [81, 105] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_201949__254.json | 61.25 | missing | missing | missing | |
| 6896 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_192542__905 | 0 | 0.000367 | 2.5373 | 1 | [81, 143] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_192542__905.json | 56.25 | missing | missing | missing | |
| 6897 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_192545__567 | 1 | 0.000333 | 2.47306 | 1 | [81, 126] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_192545__567.json | 61.25 | missing | missing | missing | |
| 6898 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_200117__981 | 0 | 0.000425 | 3.58254 | 1 | [81, 172] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_200117__981.json | 56.25 | missing | missing | missing | |
| 6899 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_200119__323 | 0 | 0.000205 | 1.68624 | 0 | [81, 62] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_200119__323.json | 50.0 | missing | missing | missing | |
| 6900 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_194245__750 | 0 | 0.0 | 1.40086 | 0 | [81, 70] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_194245__750.json | 50.0 | 0.9 | missing | 0.1 |
| 6901 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_201947__332 | 0 | 0.000202 | 1.30704 | 0 | [116, 43] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_201947__332.json | 50.0 | missing | missing | missing | |
| 6902 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192538__689 | 1 | 0.000304 | 1.86746 | 1 | [116, 94] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_192538__689.json | 61.25 | missing | missing | missing | |
| 6903 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192539__646 | 0 | 0.00022 | 1.36405 | 0 | [116, 52] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_192539__646.json | 50.0 | missing | missing | missing | |
| 6904 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200112__795 | 0 | 0.000256 | 1.76739 | 0 | [116, 70] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_200112__795.json | 50.0 | missing | missing | missing | |
| 6905 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200114__978 | 0 | 0.000222 | 1.29816 | 0 | [116, 53] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_200114__978.json | 50.0 | missing | missing | missing | |
| 6906 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_194244__433 | 0 | 0.0 | 2.42128 | 0 | [116, 44] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_194244__433.json | 50.0 | 0.9 | missing | 0.1 |
| 6907 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_201945__477 | 0 | 0.000292 | 1.07484 | 0 | [190, 51] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_201945__477.json | 50.0 | missing | missing | missing | |
| 6908 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_192535__290 | 0 | 0.000296 | 1.39333 | 1 | [190, 53] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192535__290.json | 56.25 | missing | missing | missing | |
| 6909 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_192536__268 | 0 | 0.000294 | 0.995795 | 0 | [190, 52] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192536__268.json | 50.0 | missing | missing | missing | |
| 6910 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200109__574 | 0 | 0.000308 | 1.23559 | 0 | [190, 59] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200109__574.json | 50.0 | missing | missing | missing | |
| 6911 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200110__649 | 0 | 0.000258 | 0.840587 | 0 | [190, 34] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200110__649.json | 50.0 | missing | missing | missing | |
| 6912 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_194241__818 | 0 | 0.0 | 1.46622 | 0 | [190, 50] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_194241__818.json | 50.0 | 0.9 | missing | 0.1 |
| 6913 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_201957__991 | 0 | 0.000489 | 1.35501 | 0 | [339, 75] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_201957__991.json | 0.0 | missing | missing | missing | |
| 6914 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_192556__868 | 1 | 0.000687 | 2.44964 | 2 | [339, 174] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192556__868.json | 67.5 | missing | missing | missing | |
| 6915 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_192557__755 | 0 | 0.000527 | 1.53423 | 0 | [339, 94] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192557__755.json | 25.0 | missing | missing | missing | |
| 6916 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_200127__466 | 0 | 0.000495 | 1.82064 | 0 | [339, 78] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200127__466.json | 0.0 | missing | missing | missing | |
| 6917 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_200130__890 | 0 | 0.000687 | 2.62209 | 0 | [339, 174] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200130__890.json | 50.0 | missing | missing | missing | |
| 6918 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_194252__921 | 0 | 0.0 | 2.09549 | 0 | [339, 53] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_194252__921.json | 0.0 | 0.9 | missing | 0.1 |
| 6919 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_201956__421 | 0 | 0.000768 | 3.81809 | 0 | [338, 215] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_201956__421.json | 0.0 | missing | missing | missing | |
| 6920 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192551__405 | 1 | 0.000534 | 1.62023 | 2 | [338, 98] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_192551__405.json | 67.5 | missing | missing | missing | |
| 6921 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192553__452 | 1 | 0.00058 | 2.01195 | 2 | [338, 121] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_192553__452.json | 67.5 | missing | missing | missing | |
| 6922 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200123__673 | 1 | 0.000716 | 3.58081 | 2 | [338, 189] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_200123__673.json | 67.5 | missing | missing | missing | |
| 6923 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200125__848 | 0 | 0.000644 | 2.2994 | 2 | [338, 153] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_200125__848.json | 62.5 | missing | missing | missing | |
| 6924 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_194250__213 | 0 | 0.0 | 1.82213 | 0 | [338, 91] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_194250__213.json | 50.0 | 0.9 | missing | 0.1 |
| 6925 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_092708__754 | 1 | 0.00753 | 15.0118 | 1 | [81, 224] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_092708__754.json | 61.25 | missing | missing | missing | |
| 6926 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_092726__516 | 0 | 0.00915 | 18.4564 | 0 | [81, 278] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_092726__516.json | 50.0 | missing | missing | missing | |
| 6927 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_092742__482 | 1 | 0.0081 | 16.1463 | 1 | [81, 243] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_092742__482.json | 61.25 | missing | missing | missing | |
| 6928 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_092755__313 | 1 | 0.00522 | 12.6769 | 1 | [81, 147] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_092755__313.json | 61.25 | missing | missing | missing | |
| 6929 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_092808__118 | 0 | 0.0063 | 13.1031 | 0 | [81, 183] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_092808__118.json | 50.0 | missing | missing | missing | |
| 6930 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_092455__549 | 0 | 0.00293 | 4.70474 | 0 | [116, 59] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_092455__549.json | 50.0 | missing | missing | missing | |
| 6931 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_092502__566 | 1 | 0.00356 | 6.50387 | 1 | [116, 80] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_092502__566.json | 61.25 | missing | missing | missing | |
| 6932 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_092506__273 | 0 | 0.00278 | 4.28059 | 0 | [116, 54] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_092506__273.json | 50.0 | missing | missing | missing | |
| 6933 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_092511__267 | 0 | 0.00284 | 5.14744 | 0 | [116, 56] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_092511__267.json | 50.0 | missing | missing | missing | |
| 6934 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_092518__443 | 0 | 0.0038 | 7.01057 | 0 | [116, 88] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_092518__443.json | 50.0 | missing | missing | missing | |
| 6935 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_092305__944 | 0 | 0.00607 | 9.17567 | 0 | [190, 139] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_092305__944.json | 0.0 | missing | missing | missing | |
| 6936 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_092323__762 | 1 | 0.0097 | 18.0387 | 1 | [190, 260] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_092323__762.json | 61.25 | missing | missing | missing | |
| 6937 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_092347__870 | 0 | 0.00877 | 24.2226 | 0 | [190, 229] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_092347__870.json | 0.0 | missing | missing | missing | |
| 6938 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_092406__610 | 1 | 0.00766 | 18.6784 | 1 | [190, 192] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_092406__610.json | 61.25 | missing | missing | missing | |
| 6939 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_092416__796 | 0 | 0.00544 | 10.2308 | 0 | [190, 118] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_092416__796.json | 50.0 | missing | missing | missing | |
| 6940 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_093653__757 | 0 | 0.00648 | 8.14243 | 0 | [339, 103] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_093653__757.json | 50.0 | missing | missing | missing | |
| 6941 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_093702__940 | 1 | 0.0066 | 9.06582 | 2 | [339, 107] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_093702__940.json | 67.5 | missing | missing | missing | |
| 6942 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_093757__802 | 1 | 0.0183 | 55.003 | 2 | [339, 497] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_093757__802.json | 67.5 | missing | missing | missing | |
| 6943 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_093841__418 | 0 | 0.01548 | 43.9026 | 0 | [339, 403] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_093841__418.json | 0.0 | missing | missing | missing | |
| 6944 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_093909__986 | 1 | 0.01254 | 27.8427 | 2 | [339, 305] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_093909__986.json | 67.5 | missing | missing | missing | |
| 6945 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_093135__529 | 1 | 0.00956 | 19.1759 | 2 | [338, 206] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_093135__529.json | 67.5 | missing | missing | missing | |
| 6946 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_093204__327 | 1 | 0.01358 | 29.6589 | 2 | [338, 340] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_093204__327.json | 67.5 | missing | missing | missing | |
| 6947 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_093248__636 | 1 | 0.01673 | 43.8784 | 2 | [338, 445] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_093248__636.json | 67.5 | missing | missing | missing | |
| 6948 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_093324__562 | 0 | 0.01427 | 36.1265 | 0 | [338, 363] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_093324__562.json | 50.0 | missing | missing | missing | |
| 6949 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_093334__815 | 1 | 0.00608 | 10.0043 | 2 | [338, 90] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_093334__815.json | 67.5 | missing | missing | missing | |
| 6950 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231213_202210__460 | 1 | 0.01422 | 35.0696 | 1 | [78, 448] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_202210__460.json | 61.25 | missing | missing | missing | |
| 6951 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231225_192707__601 | 1 | 0.00732 | 7.15928 | 1 | [78, 218] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_192707__601.json | 61.25 | missing | missing | missing | |
| 6952 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231225_192724__280 | 1 | 0.00681 | 16.7752 | 1 | [78, 201] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_192724__280.json | 61.25 | missing | missing | missing | |
| 6953 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 5 | 20231215_194543__584 | 0 | 0.0 | 67.335 | 0 | [78, 372] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_194543__584.json | 0.0 | 0.1 | missing | 0.9 | |
| 6954 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_202135__304 | 1 | 0.01461 | 41.1734 | 1 | [81, 460] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_202135__304.json | 61.25 | missing | missing | missing | |
| 6955 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_192647__564 | 1 | 0.0078 | 12.9101 | 1 | [81, 233] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_192647__564.json | 61.25 | missing | missing | missing | |
| 6956 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_192659__823 | 1 | 0.01068 | 11.627 | 1 | [81, 329] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_192659__823.json | 61.25 | missing | missing | missing | |
| 6957 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_200228__717 | 0 | 0.00714 | 18.184 | 0 | [81, 211] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_200228__717.json | 50.0 | missing | missing | missing | |
| 6958 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_200243__739 | 1 | 0.00867 | 15.7782 | 1 | [81, 262] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_200243__739.json | 61.25 | missing | missing | missing | |
| 6959 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_194436__306 | 1 | 0.0 | 51.456 | 1 | [81, 393] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_194436__306.json | 61.25 | 0.1 | missing | 0.9 | |
| 6960 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202054__184 | 0 | 0.00389 | 6.30827 | 0 | [116, 91] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_202054__184.json | 50.0 | missing | missing | missing | |
| 6961 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192631__723 | 0 | 0.00575 | 7.93756 | 0 | [116, 153] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_192631__723.json | 50.0 | missing | missing | missing | |
| 6962 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192634__425 | 0 | 0.00323 | 2.41713 | 0 | [116, 69] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_192634__425.json | 50.0 | missing | missing | missing | |
| 6963 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200203__907 | 0 | 0.00422 | 5.84569 | 0 | [116, 102] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_200203__907.json | 50.0 | missing | missing | missing | |
| 6964 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200209__513 | 0 | 0.00323 | 6.12119 | 0 | [116, 69] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_200209__513.json | 50.0 | missing | missing | missing | |
| 6965 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_194344__907 | 0 | 0.0 | 16.6792 | 0 | [116, 109] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_194344__907.json | 50.0 | 0.1 | missing | 0.9 | |
| 6966 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_202047__884 | 0 | 0.01693 | 50.1562 | 0 | [190, 501] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202047__884.json | 0.0 | missing | missing | missing | |
| 6967 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_192614__884 | 0 | 0.01213 | 16.2238 | 0 | [190, 341] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192614__884.json | 0.0 | missing | missing | missing | |
| 6968 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_192623__594 | 0 | 0.00832 | 9.63418 | 0 | [190, 214] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192623__594.json | 0.0 | missing | missing | missing | |
| 6969 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200138__286 | 0 | 0.00532 | 8.73588 | 0 | [190, 114] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200138__286.json | 50.0 | missing | missing | missing | |
| 6970 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_200157__460 | 0 | 0.01081 | 18.7907 | 0 | [190, 297] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200157__460.json | 0.0 | missing | missing | missing | |
| 6971 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_194327__433 | 0 | 0.0 | 35.6828 | 0 | [190, 366] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_194327__433.json | 0.0 | 0.1 | missing | 0.9 | |
| 6972 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202259__375 | 1 | 0.00954 | 21.8738 | 2 | [339, 205] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202259__375.json | 67.5 | missing | missing | missing | |
| 6973 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_192815__742 | 1 | 0.01116 | 19.5601 | 2 | [339, 259] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192815__742.json | 67.5 | missing | missing | missing | |
| 6974 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_192829__139 | 0 | 0.0138 | 14.0452 | 0 | [339, 347] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192829__139.json | 50.0 | missing | missing | missing | |
| 6975 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_200338__473 | 1 | 0.0117 | 16.1249 | 2 | [339, 277] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200338__473.json | 67.5 | missing | missing | missing | |
| 6976 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_200357__259 | 1 | 0.01389 | 18.9118 | 2 | [339, 350] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200357__259.json | 67.5 | missing | missing | missing | |
| 6977 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_194703__635 | 1 | 0.0 | 24.2074 | 2 | [339, 322] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_194703__635.json | 67.5 | 0.1 | missing | 0.9 | |
| 6978 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202237__778 | 1 | 0.01091 | 26.7196 | 2 | [338, 251] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_202237__778.json | 67.5 | missing | missing | missing | |
| 6979 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192740__718 | 1 | 0.01256 | 15.8597 | 2 | [338, 306] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_192740__718.json | 67.5 | missing | missing | missing | |
| 6980 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192755__556 | 0 | 0.01274 | 14.4163 | 0 | [338, 312] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_192755__556.json | 50.0 | missing | missing | missing | |
| 6981 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200301__756 | 0 | 0.01328 | 17.3604 | 0 | [338, 330] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_200301__756.json | 50.0 | missing | missing | missing | |
| 6982 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200322__353 | 1 | 0.01394 | 20.2998 | 2 | [338, 352] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_200322__353.json | 67.5 | missing | missing | missing | |
| 6983 | Apple-MacBook-Pro-M1 | extract_julia_code | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_194639__931 | 1 | 0.0 | 55.618 | 2 | [338, 382] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_194639__931.json | 67.5 | 0.1 | missing | 0.9 | |
| 6984 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | AsIs | 1SHOT | false | false | 5 | 20231214_005800__394 | 0 | 0.0 | 10.1085 | 0 | [72, 300] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__AsIs__1SHOT__20231214_005800__394.json | 0.0 | missing | missing | missing | |
| 6985 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_140748__716 | 0 | 0.0 | 13.5878 | 0 | [72, 404] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__AsIs__1SHOT__20231225_140748__716.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6986 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_140757__162 | 0 | 0.0 | 8.69717 | 0 | [1, 275] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__AsIs__1SHOT__20231225_140757__162.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6987 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | InJulia | 1SHOT | true | true | 5 | 20231214_005750__948 | 0 | 0.0 | 16.2995 | 0 | [89, 476] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__InJulia__1SHOT__20231214_005750__948.json | 50.0 | missing | missing | missing | |
| 6988 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_140722__171 | 0 | 0.0 | 18.3163 | 0 | [89, 536] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__InJulia__1SHOT__20231225_140722__171.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6989 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_140735__895 | 0 | 0.0 | 12.7006 | 0 | [1, 393] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__InJulia__1SHOT__20231225_140735__895.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6990 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | InJulia | 1SHOT | false | false | 5 | 20231227_001712__113 | 0 | 0.0 | 20.6041 | 0 | [89, 605] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__InJulia__1SHOT__20231227_001712__113.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6991 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005733__282 | 0 | 0.0 | 3.28608 | 0 | [118, 78] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_005733__282.json | 50.0 | missing | missing | missing | |
| 6992 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_140659__610 | 0 | 0.0 | 10.5639 | 0 | [118, 303] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_140659__610.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6993 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_140703__499 | 0 | 0.0 | 3.98842 | 0 | [1, 127] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_140703__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6994 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_001651__327 | 0 | 0.0 | 10.0247 | 0 | [118, 290] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_001651__327.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6995 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_005730__644 | 0 | 0.0 | 12.9024 | 0 | [211, 336] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005730__644.json | 25.0 | missing | missing | missing | |
| 6996 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_140639__126 | 1 | 0.0 | 28.2002 | 1 | [229, 611] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140639__126.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 6997 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_140649__602 | 0 | 0.0 | 9.80533 | 0 | [1, 294] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140649__602.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6998 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_001641__340 | 0 | 0.0 | 14.7387 | 0 | [229, 250] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001641__340.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 6999 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_005836__474 | 0 | 0.0 | 23.1775 | 0 | [11, 616] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005836__474.json | 0.0 | missing | missing | missing | |
| 7000 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_140847__663 | 0 | 0.0 | 15.2945 | 0 | [11, 420] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140847__663.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7001 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_140849__103 | 0 | 0.0 | 1.97269 | 0 | [1, 57] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140849__103.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7002 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_001746__971 | 0 | 0.0 | 21.914 | 0 | [11, 595] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001746__971.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7003 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_005813__246 | 0 | 0.0 | 13.1495 | 0 | [389, 270] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_005813__246.json | 0.0 | missing | missing | missing | |
| 7004 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_140802__171 | 0 | 0.0 | 4.85101 | 0 | [389, 36] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_140802__171.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7005 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_140831__179 | 0 | 0.0 | 29.3798 | 0 | [1, 776] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_140831__179.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7006 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_001724__134 | 0 | 0.0 | 12.2519 | 0 | [389, 250] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_001724__134.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7007 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | AsIs | 1SHOT | false | false | 5 | 20231214_072718__760 | 0 | 0.0 | 23.5008 | 0 | [72, 288] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__AsIs__1SHOT__20231214_072718__760.json | 0.0 | missing | missing | missing | |
| 7008 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_143020__737 | 0 | 0.0 | 7.76813 | 0 | [86, 253] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__AsIs__1SHOT__20231225_143020__737.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7009 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_143029__590 | 0 | 0.0 | 8.84015 | 0 | [86, 289] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__AsIs__1SHOT__20231225_143029__590.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7010 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | InJulia | 1SHOT | true | true | 5 | 20231214_072654__478 | 0 | 0.0 | 48.6911 | 0 | [89, 373] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__InJulia__1SHOT__20231214_072654__478.json | 50.0 | missing | missing | missing | |
| 7011 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_143003__366 | 1 | 0.0 | 9.81618 | 1 | [89, 323] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__InJulia__1SHOT__20231225_143003__366.json | 61.25 | missing | {"num_gpu": 99} | missing | |
| 7012 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_143012__939 | 1 | 0.0 | 9.08471 | 2 | [89, 297] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__InJulia__1SHOT__20231225_143012__939.json | 67.5 | missing | {"num_gpu": 99} | missing | |
| 7013 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_002458__617 | 1 | 0.0 | 8.70972 | 2 | [89, 283] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__InJulia__1SHOT__20231227_002458__617.json | 67.5 | missing | {"num_gpu": 99} | missing | |
| 7014 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_072605__949 | 0 | 0.0 | 55.1543 | 0 | [118, 420] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_072605__949.json | 0.0 | missing | missing | missing | |
| 7015 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_142942__712 | 0 | 0.0 | 7.41092 | 0 | [128, 236] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_142942__712.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7016 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_142953__702 | 0 | 0.0 | 10.3598 | 0 | [128, 334] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_142953__702.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 7017 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_002450__715 | 0 | 0.0 | 9.90197 | 0 | [128, 317] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_002450__715.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7018 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_072510__551 | 1 | 0.0 | 106.253 | 2 | [211, 622] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_072510__551.json | 67.5 | missing | missing | missing | |
| 7019 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_142928__154 | 0 | 0.0 | 16.0749 | 0 | [221, 308] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142928__154.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7020 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_142935__874 | 0 | 0.0 | 6.70834 | 0 | [221, 194] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142935__874.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 7021 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_002440__348 | 0 | 0.0 | 13.5333 | 0 | [221, 229] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002440__348.json | 25.0 | missing | {"num_gpu": 99} | missing | |
| 7022 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_072804__461 | 0 | 0.0 | 15.915 | 0 | [11, 435] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_072804__461.json | 50.0 | missing | missing | missing | |
| 7023 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_143103__727 | 0 | 0.0 | 5.50103 | 0 | [392, 124] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143103__727.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7024 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_143118__282 | 1 | 0.0 | 14.4601 | 1 | [392, 410] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143118__282.json | 61.25 | missing | {"num_gpu": 99} | missing | |
| 7025 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_002530__770 | 0 | 0.0 | 13.837 | 0 | [392, 388] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002530__770.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 7026 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_072748__801 | 0 | 0.0 | 30.4187 | 0 | [389, 712] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_072748__801.json | 50.0 | missing | missing | missing | |
| 7027 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_143042__705 | 1 | 0.0 | 12.5475 | 2 | [389, 351] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_143042__705.json | 67.5 | missing | {"num_gpu": 99} | missing | |
| 7028 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_143057__177 | 0 | 0.0 | 15.3 | 0 | [389, 436] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_143057__177.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 7029 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_002516__849 | 1 | 0.0 | 16.8136 | 2 | [389, 480] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_002516__849.json | 67.5 | missing | {"num_gpu": 99} | missing | |
| 7030 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_180952__507 | 0 | 0.0 | 17.411 | 0 | [89, 337] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180952__507.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 7031 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180957__736 | 0 | 0.0 | 5.67614 | 0 | [89, 104] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180957__736.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7032 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181020__889 | 1 | 0.0 | 23.106 | 1 | [89, 448] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181020__889.json | 61.25 | missing | {"num_gpu": 99} | missing | |
| 7033 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180855__299 | 0 | 0.0 | 11.6991 | 0 | [128, 220] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180855__299.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7034 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_180916__174 | 0 | 0.0 | 20.7913 | 0 | [128, 398] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180916__174.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 7035 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_180934__742 | 0 | 0.0 | 18.2323 | 0 | [128, 349] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180934__742.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 7036 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180814__324 | 0 | 0.0 | 11.9533 | 0 | [221, 213] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180814__324.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7037 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180830__508 | 0 | 0.0 | 15.6698 | 1 | [221, 286] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180830__508.json | 56.25 | missing | {"num_gpu": 99} | missing | |
| 7038 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180843__625 | 0 | 0.0 | 12.8595 | 0 | [221, 231] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180843__625.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7039 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181138__945 | 0 | 0.0 | 15.7593 | 0 | [392, 264] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181138__945.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7040 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_181156__689 | 0 | 0.0 | 17.5477 | 0 | [392, 298] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181156__689.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 7041 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181207__472 | 1 | 0.0 | 10.723 | 1 | [392, 167] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181207__472.json | 61.25 | missing | {"num_gpu": 99} | missing | |
| 7042 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181040__752 | 1 | 0.0 | 19.6361 | 1 | [389, 338] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181040__752.json | 61.25 | missing | {"num_gpu": 99} | missing | |
| 7043 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181057__827 | 0 | 0.0 | 16.9697 | 0 | [389, 287] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181057__827.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 7044 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181122__285 | 1 | 0.0 | 24.2607 | 1 | [389, 425] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181122__285.json | 61.25 | missing | {"num_gpu": 99} | missing | |
| 7045 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231213_202512__446 | 0 | 0.00326864 | 15.914 | 0 | [84, 376] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__AsIs__1SHOT__20231213_202512__446.json | 50.0 | missing | missing | missing | |
| 7046 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231225_193151__519 | 0 | 0.0025729 | 6.46731 | 0 | [84, 290] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__AsIs__1SHOT__20231225_193151__519.json | 50.0 | missing | missing | missing | |
| 7047 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231225_193200__783 | 1 | 0.0032201 | 8.25801 | 2 | [84, 370] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__AsIs__1SHOT__20231225_193200__783.json | 67.5 | missing | missing | missing | |
| 7048 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium--optim | AsIs | 1SHOT | true | false | 5 | 20231215_194852__196 | 0 | 0.0 | 5.77357 | 0 | [84, 261] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__AsIs__1SHOT__20231215_194852__196.json | 25.0 | 0.9 | missing | 0.3 | |
| 7049 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231213_202456__116 | 0 | 0.00386731 | 18.7168 | 0 | [87, 449] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__InJulia__1SHOT__20231213_202456__116.json | 25.0 | missing | missing | missing | |
| 7050 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231225_193135__499 | 0 | 0.00307449 | 7.86205 | 0 | [87, 351] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__InJulia__1SHOT__20231225_193135__499.json | 25.0 | missing | missing | missing | |
| 7051 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231225_193145__806 | 0 | 0.00275089 | 9.97241 | 0 | [87, 311] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__InJulia__1SHOT__20231225_193145__806.json | 25.0 | missing | missing | missing | |
| 7052 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | InJulia | 1SHOT | false | false | 5 | 20231227_200713__899 | 0 | 0.00305831 | 8.21265 | 0 | [87, 349] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__InJulia__1SHOT__20231227_200713__899.json | 0.0 | missing | missing | missing | |
| 7053 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | InJulia | 1SHOT | false | false | 5 | 20231227_200721__768 | 0 | 0.00309067 | 7.93904 | 0 | [87, 353] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__InJulia__1SHOT__20231227_200721__768.json | 0.0 | missing | missing | missing | |
| 7054 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium--optim | InJulia | 1SHOT | true | false | 5 | 20231215_194846__298 | 0 | 0.0 | 12.9568 | 0 | [87, 296] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__InJulia__1SHOT__20231215_194846__298.json | 25.0 | 0.9 | missing | 0.3 | |
| 7055 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_202437__892 | 0 | 0.00217663 | 9.4886 | 0 | [126, 227] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_202437__892.json | 25.0 | missing | missing | missing | |
| 7056 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_193116__768 | 0 | 0.00271057 | 14.0171 | 0 | [126, 293] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_193116__768.json | 25.0 | missing | missing | missing | |
| 7057 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_193126__715 | 0 | 0.00389171 | 9.84133 | 0 | [126, 439] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_193126__715.json | 25.0 | missing | missing | missing | |
| 7058 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_200657__457 | 0 | 0.0023627 | 16.2987 | 0 | [126, 250] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_200657__457.json | 25.0 | missing | missing | missing | |
| 7059 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_200705__654 | 0 | 0.00273484 | 6.64976 | 0 | [126, 296] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_200705__654.json | 25.0 | missing | missing | missing | |
| 7060 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231215_194833__220 | 0 | 0.0 | 18.4032 | 0 | [126, 308] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_194833__220.json | 25.0 | 0.9 | missing | 0.3 | |
| 7061 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202427__274 | 0 | 0.00443405 | 19.9586 | 0 | [219, 475] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202427__274.json | 50.0 | missing | missing | missing | |
| 7062 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_193044__419 | 0 | 0.00409427 | 23.7358 | 0 | [219, 433] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193044__419.json | 0.0 | missing | missing | missing | |
| 7063 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_193101__697 | 0 | 0.00422371 | 17.8311 | 0 | [219, 449] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193101__697.json | 25.0 | missing | missing | missing | |
| 7064 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_200606__622 | 0 | 0.00387584 | 18.433 | 0 | [219, 406] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200606__622.json | 25.0 | missing | missing | missing | |
| 7065 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_200641__831 | 0 | 0.00402955 | 34.5911 | 0 | [219, 425] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200641__831.json | 25.0 | missing | missing | missing | |
| 7066 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_194815__290 | 0 | 0.0 | 18.9035 | 0 | [219, 346] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_194815__290.json | 0.0 | 0.9 | missing | 0.3 | |
| 7067 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202555__841 | 1 | 0.00497395 | 20.6553 | 2 | [389, 485] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202555__841.json | 67.5 | missing | missing | missing | |
| 7068 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_193257__972 | 0 | 0.00594475 | 33.0951 | 0 | [389, 605] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193257__972.json | 25.0 | missing | missing | missing | |
| 7069 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_193337__862 | 1 | 0.00442383 | 40.3501 | 2 | [389, 417] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193337__862.json | 67.5 | missing | missing | missing | |
| 7070 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_200810__613 | 0 | 0.00424585 | 17.5426 | 0 | [389, 395] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200810__613.json | 50.0 | missing | missing | missing | |
| 7071 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_200827__370 | 0 | 0.00393034 | 16.8165 | 0 | [389, 356] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200827__370.json | 25.0 | missing | missing | missing | |
| 7072 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231215_194917__810 | 0 | 0.0 | 14.2546 | 0 | [389, 372] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_194917__810.json | 25.0 | 0.9 | missing | 0.3 | |
| 7073 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202534__381 | 1 | 0.00535417 | 22.2614 | 2 | [386, 533] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_202534__381.json | 67.5 | missing | missing | missing | |
| 7074 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_193208__599 | 0 | 0.00410831 | 8.6864 | 0 | [386, 379] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_193208__599.json | 25.0 | missing | missing | missing | |
| 7075 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_193223__755 | 0 | 0.00522473 | 14.6537 | 0 | [386, 517] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_193223__755.json | 25.0 | missing | missing | missing | |
| 7076 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200737__189 | 0 | 0.00571013 | 15.6561 | 0 | [386, 577] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_200737__189.json | 50.0 | missing | missing | missing | |
| 7077 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200752__618 | 1 | 0.00396269 | 15.2086 | 2 | [386, 361] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_200752__618.json | 67.5 | missing | missing | missing | |
| 7078 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_194902__973 | 1 | 0.0 | 10.1474 | 2 | [386, 449] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_194902__973.json | 67.5 | 0.9 | missing | 0.3 | |
| 7079 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | AsIs | 1SHOT | true | true | 5 | 20231213_202345__258 | 1 | 0.00107802 | 7.07255 | 1 | [86, 527] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__AsIs__1SHOT__20231213_202345__258.json | 61.25 | missing | missing | missing | |
| 7080 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | AsIs | 1SHOT | true | true | 5 | 20231225_192952__762 | 1 | 0.000701662 | 4.48819 | 1 | [86, 333] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__AsIs__1SHOT__20231225_192952__762.json | 61.25 | missing | missing | missing | |
| 7081 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_192958__181 | 0 | 0.000930582 | 6.05321 | 0 | [86, 451] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__AsIs__1SHOT__20231225_192958__181.json | 0.0 | missing | missing | missing | |
| 7082 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small--optim | AsIs | 1SHOT | true | true | 5 | 20231215_194741__714 | 1 | 0.0 | 6.19549 | 1 | [86, 463] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__AsIs__1SHOT__20231215_194741__714.json | 61.25 | 0.9 | missing | 0.3 | |
| 7083 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231213_202338__927 | 1 | 0.000540643 | 3.4105 | 2 | [89, 249] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__InJulia__1SHOT__20231213_202338__927.json | 67.5 | missing | missing | missing | |
| 7084 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_192943__392 | 1 | 0.000781203 | 5.01957 | 1 | [89, 373] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__InJulia__1SHOT__20231225_192943__392.json | 61.25 | missing | missing | missing | |
| 7085 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_192947__407 | 1 | 0.000647343 | 4.22164 | 1 | [89, 304] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__InJulia__1SHOT__20231225_192947__407.json | 61.25 | missing | missing | missing | |
| 7086 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_200459__270 | 0 | 0.000918943 | 5.9889 | 1 | [89, 444] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__InJulia__1SHOT__20231227_200459__270.json | 56.25 | missing | missing | missing | |
| 7087 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | InJulia | 1SHOT | false | false | 5 | 20231227_200505__272 | 0 | 0.000928643 | 6.02362 | 0 | [89, 449] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__InJulia__1SHOT__20231227_200505__272.json | 0.0 | missing | missing | missing | |
| 7088 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small--optim | InJulia | 1SHOT | false | false | 5 | 20231215_194735__628 | 0 | 0.0 | 3.68195 | 0 | [89, 273] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__InJulia__1SHOT__20231215_194735__628.json | 0.0 | 0.9 | missing | 0.3 | |
| 7089 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202335__443 | 0 | 0.00077475 | 4.88681 | 0 | [130, 356] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_202335__443.json | 50.0 | missing | missing | missing | |
| 7090 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192935__554 | 0 | 0.00098233 | 6.20694 | 0 | [130, 463] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_192935__554.json | 50.0 | missing | missing | missing | |
| 7091 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192938__477 | 1 | 0.00043137 | 2.56982 | 1 | [130, 179] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_192938__477.json | 61.25 | missing | missing | missing | |
| 7092 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200447__680 | 0 | 0.00077087 | 4.91599 | 0 | [130, 354] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_200447__680.json | 50.0 | missing | missing | missing | |
| 7093 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200453__580 | 0 | 0.00087757 | 5.62561 | 0 | [130, 409] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_200453__580.json | 50.0 | missing | missing | missing | |
| 7094 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_194731__855 | 0 | 0.0 | 4.89224 | 0 | [130, 372] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_194731__855.json | 50.0 | 0.9 | missing | 0.3 | |
| 7095 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202330__802 | 0 | 0.000971368 | 5.84857 | 0 | [224, 426] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202330__802.json | 50.0 | missing | missing | missing | |
| 7096 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_192924__130 | 0 | 0.000816168 | 4.79322 | 0 | [224, 346] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192924__130.json | 0.0 | missing | missing | missing | |
| 7097 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_192929__935 | 0 | 0.000853028 | 4.98009 | 0 | [224, 365] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192929__935.json | 0.0 | missing | missing | missing | |
| 7098 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200437__816 | 1 | 0.000872428 | 5.2252 | 2 | [224, 375] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200437__816.json | 67.5 | missing | missing | missing | |
| 7099 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200442__676 | 1 | 0.000699768 | 4.2866 | 2 | [224, 286] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200442__676.json | 67.5 | missing | missing | missing | |
| 7100 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_194727__428 | 0 | 0.0 | 4.23141 | 0 | [224, 313] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_194727__428.json | 50.0 | 0.9 | missing | 0.3 | |
| 7101 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202407__350 | 1 | 0.00197505 | 12.224 | 2 | [396, 886] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202407__350.json | 67.5 | missing | missing | missing | |
| 7102 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_193015__857 | 1 | 0.00129411 | 7.28862 | 2 | [396, 535] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193015__857.json | 67.5 | missing | missing | missing | |
| 7103 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_193020__930 | 0 | 0.000954612 | 4.98604 | 2 | [396, 360] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193020__930.json | 62.5 | missing | missing | missing | |
| 7104 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_200540__955 | 1 | 0.00148035 | 8.70707 | 2 | [396, 631] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200540__955.json | 67.5 | missing | missing | missing | |
| 7105 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_200548__196 | 1 | 0.00136977 | 7.79284 | 2 | [396, 574] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200548__196.json | 67.5 | missing | missing | missing | |
| 7106 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231215_194756__144 | 0 | 0.0 | 5.97452 | 0 | [396, 446] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_194756__144.json | 25.0 | 0.9 | missing | 0.3 | |
| 7107 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202355__820 | 0 | 0.00135102 | 9.33105 | 0 | [394, 565] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_202355__820.json | 50.0 | missing | missing | missing | |
| 7108 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_193003__789 | 0 | 0.000864078 | 4.32077 | 2 | [394, 314] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_193003__789.json | 62.5 | missing | missing | missing | |
| 7109 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_193007__620 | 1 | 0.000866018 | 4.2952 | 2 | [394, 315] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_193007__620.json | 67.5 | missing | missing | missing | |
| 7110 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200517__590 | 0 | 0.00195436 | 11.9053 | 0 | [394, 876] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_200517__590.json | 50.0 | missing | missing | missing | |
| 7111 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200531__745 | 0 | 0.00224342 | 13.9465 | 0 | [394, 1025] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_200531__745.json | 50.0 | missing | missing | missing | |
| 7112 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-small--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_194750__464 | 0 | 0.0 | 8.33527 | 0 | [394, 620] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_194750__464.json | 0.0 | 0.9 | missing | 0.3 | |
| 7113 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_202313__292 | 0 | 0.00010717 | 3.82631 | 0 | [86, 210] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__AsIs__1SHOT__20231213_202313__292.json | 0.0 | missing | missing | missing | |
| 7114 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | AsIs | 1SHOT | true | true | 5 | 20231225_192857__951 | 0 | 0.000100828 | 1.83793 | 0 | [86, 196] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__AsIs__1SHOT__20231225_192857__951.json | 50.0 | missing | missing | missing | |
| 7115 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | AsIs | 1SHOT | true | true | 5 | 20231225_192859__453 | 0 | 0.000117589 | 2.10925 | 0 | [86, 233] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__AsIs__1SHOT__20231225_192859__453.json | 50.0 | missing | missing | missing | |
| 7116 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny--optim | AsIs | 1SHOT | true | true | 5 | 20231215_194713__181 | 0 | 0.0 | 3.09002 | 0 | [86, 143] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__AsIs__1SHOT__20231215_194713__181.json | 50.0 | 0.9 | missing | 0.3 | |
| 7117 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231213_202309__577 | 0 | 0.000182335 | 4.09154 | 2 | [89, 375] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__InJulia__1SHOT__20231213_202309__577.json | 62.5 | missing | missing | missing | |
| 7118 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | InJulia | 1SHOT | false | false | 5 | 20231225_192852__557 | 0 | 9.4453e-5 | 2.43564 | 0 | [89, 181] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__InJulia__1SHOT__20231225_192852__557.json | 0.0 | missing | missing | missing | |
| 7119 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_192855__912 | 0 | 0.000108949 | 1.92664 | 0 | [89, 213] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__InJulia__1SHOT__20231225_192855__912.json | 50.0 | missing | missing | missing | |
| 7120 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | InJulia | 1SHOT | true | false | 5 | 20231227_200413__784 | 0 | 0.000107137 | 1.98981 | 0 | [89, 209] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__InJulia__1SHOT__20231227_200413__784.json | 25.0 | missing | missing | missing | |
| 7121 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_200416__760 | 0 | 0.000119368 | 2.1666 | 2 | [89, 236] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__InJulia__1SHOT__20231227_200416__760.json | 62.5 | missing | missing | missing | |
| 7122 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny--optim | InJulia | 1SHOT | true | true | 5 | 20231215_194709__150 | 0 | 0.0 | 1.46008 | 0 | [89, 161] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__InJulia__1SHOT__20231215_194709__150.json | 50.0 | 0.9 | missing | 0.3 | |
| 7123 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_202305__194 | 0 | 6.0782e-5 | 1.44451 | 0 | [130, 94] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_202305__194.json | 0.0 | missing | missing | missing | |
| 7124 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_192840__251 | 0 | 0.00013145 | 2.20195 | 0 | [130, 250] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_192840__251.json | 50.0 | missing | missing | missing | |
| 7125 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_192850__999 | 0 | 7.5278e-5 | 10.1219 | 0 | [130, 126] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_192850__999.json | 0.0 | missing | missing | missing | |
| 7126 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200409__187 | 0 | 6.8936e-5 | 1.10603 | 0 | [130, 112] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_200409__187.json | 50.0 | missing | missing | missing | |
| 7127 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200411__464 | 1 | 0.000117407 | 2.04937 | 2 | [130, 219] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_200411__464.json | 67.5 | missing | missing | missing | |
| 7128 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_194708__475 | 1 | 0.0 | 1.80842 | 2 | [130, 197] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_194708__475.json | 67.5 | 0.9 | missing | 0.3 | |
| 7129 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_202304__509 | 0 | 9.2968e-5 | 4.11712 | 0 | [224, 136] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202304__509.json | 0.0 | missing | missing | missing | |
| 7130 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_192837__455 | 0 | 7.1224e-5 | 0.909416 | 0 | [224, 88] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192837__455.json | 0.0 | missing | missing | missing | |
| 7131 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_192837__980 | 0 | 0.000136456 | 7.1797 | 0 | [224, 232] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_192837__980.json | 0.0 | missing | missing | missing | |
| 7132 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_200406__241 | 0 | 0.000118336 | 7.95558 | 0 | [224, 192] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200406__241.json | 0.0 | missing | missing | missing | |
| 7133 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_200408__845 | 0 | 0.00014914 | 2.40317 | 0 | [224, 260] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200408__845.json | 0.0 | missing | missing | missing | |
| 7134 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_194706__881 | 0 | 0.0 | 2.74253 | 0 | [224, 92] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_194706__881.json | 0.0 | 0.9 | missing | 0.3 | |
| 7135 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202324__758 | 0 | 0.000208554 | 3.46104 | 0 | [396, 338] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202324__758.json | 50.0 | missing | missing | missing | |
| 7136 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_192915__454 | 0 | 0.000287376 | 4.83433 | 0 | [396, 512] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192915__454.json | 50.0 | missing | missing | missing | |
| 7137 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_192919__726 | 0 | 0.000215802 | 3.29003 | 0 | [396, 354] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_192919__726.json | 25.0 | missing | missing | missing | |
| 7138 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_200427__785 | 0 | 0.000201306 | 3.01599 | 0 | [396, 322] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200427__785.json | 50.0 | missing | missing | missing | |
| 7139 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_200432__567 | 0 | 0.000260649 | 4.21008 | 0 | [396, 453] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200432__567.json | 0.0 | missing | missing | missing | |
| 7140 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_194722__539 | 0 | 0.0 | 4.50297 | 0 | [396, 480] | 0.10.0-DEV | 4 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_194722__539.json | 50.0 | 0.9 | missing | 0.3 | |
| 7141 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_202320__739 | 0 | 0.000254933 | 7.09443 | 0 | [394, 441] | 0.10.0-DEV | 4 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_202320__739.json | 25.0 | missing | missing | missing | |
| 7142 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192906__827 | 1 | 0.000322883 | 6.74428 | 2 | [394, 591] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_192906__827.json | 67.5 | missing | missing | missing | |
| 7143 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_192910__700 | 1 | 0.000276224 | 4.5902 | 2 | [394, 488] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_192910__700.json | 67.5 | missing | missing | missing | |
| 7144 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200420__162 | 0 | 0.000269882 | 4.42274 | 0 | [394, 474] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_200420__162.json | 50.0 | missing | missing | missing | |
| 7145 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_200424__192 | 0 | 0.000249497 | 4.02072 | 0 | [394, 429] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_200424__192.json | 0.0 | missing | missing | missing | |
| 7146 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_194718__385 | 0 | 0.0 | 5.08443 | 0 | [394, 597] | 0.10.0-DEV | 4 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_194718__385.json | 50.0 | 0.9 | missing | 0.3 |
| 7147 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231220_074017__310 | 0 | 0.0 | 17.2184 | 0 | [72, 504] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231220_074017__310.json | 0.0 | missing | missing | missing | |
| 7148 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231225_145817__427 | 1 | 0.0 | 8.87272 | 1 | [85, 220] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_145817__427.json | 61.25 | missing | {"num_gpu": 99} | missing |
| 7149 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_145828__818 | 0 | 0.0 | 10.644 | 0 | [85, 266] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_145828__818.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7150 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231220_073949__970 | 0 | 0.0 | 15.8077 | 0 | [1, 482] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231220_073949__970.json | 50.0 | missing | missing | missing | |
| 7151 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231220_074000__680 | 0 | 0.0 | 10.2106 | 0 | [1, 320] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231220_074000__680.json | 50.0 | missing | missing | missing | |
| 7152 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_145800__767 | 0 | 0.0 | 7.36794 | 0 | [88, 181] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_145800__767.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7153 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_145808__476 | 0 | 0.0 | 8.10971 | 0 | [88, 200] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_145808__476.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7154 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_003945__720 | 0 | 0.0 | 6.87629 | 0 | [88, 167] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_003945__720.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7155 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_235349__824 | 0 | 0.0 | 12.2304 | 0 | [118, 351] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_235349__824.json | 50.0 | missing | missing | missing | |
| 7156 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231220_073915__951 | 0 | 0.0 | 10.6741 | 0 | [135, 299] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231220_073915__951.json | 50.0 | missing | missing | missing | |
| 7157 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_145745__910 | 0 | 0.0 | 7.63909 | 0 | [129, 179] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_145745__910.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7158 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_145753__442 | 0 | 0.0 | 7.44825 | 0 | [129, 174] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_145753__442.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7159 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_003938__574 | 0 | 0.0 | 12.2066 | 0 | [129, 295] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_003938__574.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7160 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_235325__758 | 0 | 0.0 | 5.66995 | 0 | [1, 174] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_235325__758.json | 50.0 | missing | missing | missing | |
| 7161 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_235337__183 | 0 | 0.0 | 12.2291 | 0 | [1, 363] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_235337__183.json | 50.0 | missing | missing | missing | |
| 7162 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_145732__575 | 0 | 0.0 | 11.3329 | 0 | [223, 119] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145732__575.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7163 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_145738__703 | 0 | 0.0 | 6.17309 | 0 | [223, 130] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145738__703.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7164 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_003925__220 | 1 | 0.0 | 21.1589 | 1 | [223, 374] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_003925__220.json | 61.25 | missing | {"num_gpu": 99} | missing |
| 7165 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_145911__482 | 1 | 0.0 | 14.9973 | 0 | [396, 325] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145911__482.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 7166 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_145928__621 | 0 | 0.0 | 16.9559 | 1 | [396, 373] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145928__621.json | 56.25 | missing | {"num_gpu": 99} | missing |
| 7167 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_003959__637 | 0 | 0.0 | 6.15009 | 0 | [396, 102] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_003959__637.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7168 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_145837__693 | 0 | 0.0 | 9.08408 | 0 | [394, 177] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145837__693.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7169 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_145856__641 | 0 | 0.0 | 18.6276 | 0 | [394, 414] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145856__641.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7170 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_003953__874 | 0 | 0.0 | 8.20422 | 0 | [394, 154] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_003953__874.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7171 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_234122__825 | 0 | 0.0 | 20.1359 | 0 | [87, 635] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234122__825.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7172 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_234133__904 | 1 | 0.0 | 10.883 | 2 | [87, 343] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234133__904.json | 67.5 | missing | {"num_gpu": 99} | missing |
| 7173 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_234141__167 | 0 | 0.0 | 8.16404 | 0 | [87, 255] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234141__167.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 7174 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_234151__399 | 0 | 0.0 | 9.45179 | 0 | [87, 297] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234151__399.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7175 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_234201__726 | 1 | 0.0 | 9.47905 | 2 | [87, 298] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234201__726.json | 67.5 | missing | {"num_gpu": 99} | missing |
| 7176 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234038__996 | 0 | 0.0 | 3.15459 | 0 | [128, 85] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234038__996.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7177 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_234042__148 | 0 | 0.0 | 4.38565 | 0 | [128, 126] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234042__148.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7178 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234050__630 | 0 | 0.0 | 7.93253 | 0 | [128, 242] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234050__630.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7179 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234055__922 | 0 | 0.0 | 4.985 | 0 | [128, 146] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234055__922.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7180 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234102__801 | 0 | 0.0 | 6.53412 | 0 | [128, 196] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234102__801.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7181 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_233948__525 | 0 | 0.0 | 13.1358 | 0 | [222, 365] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233948__525.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7182 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234004__843 | 0 | 0.0 | 16.6248 | 0 | [222, 498] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234004__843.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7183 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234015__353 | 0 | 0.0 | 10.5963 | 0 | [222, 310] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234015__353.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7184 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234025__467 | 1 | 0.0 | 10.1018 | 2 | [222, 294] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234025__467.json | 67.5 | missing | {"num_gpu": 99} | missing |
| 7185 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_234035__255 | 0 | 0.0 | 9.55406 | 0 | [222, 277] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234035__255.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7186 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_234323__920 | 0 | 0.0 | 10.6055 | 0 | [395, 277] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234323__920.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 7187 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234336__768 | 0 | 0.0 | 12.9084 | 0 | [395, 348] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234336__768.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7188 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234357__971 | 0 | 0.0 | 21.0842 | 0 | [395, 594] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234357__971.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7189 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234414__438 | 0 | 0.0 | 17.0635 | 0 | [395, 474] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234414__438.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7190 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234427__705 | 0 | 0.0 | 12.434 | 0 | [395, 333] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234427__705.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7191 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234213__156 | 0 | 0.0 | 12.0336 | 0 | [393, 321] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234213__156.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7192 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234226__129 | 0 | 0.0 | 12.6612 | 0 | [393, 340] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234226__129.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7193 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234245__182 | 0 | 0.0 | 19.0305 | 0 | [393, 533] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234245__182.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7194 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_234301__282 | 0 | 0.0 | 15.3566 | 0 | [393, 422] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234301__282.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 7195 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_234312__552 | 0 | 0.0 | 11.0397 | 0 | [393, 290] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234312__552.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7196 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_234623__545 | 0 | 0.0 | 10.3962 | 0 | [87, 257] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234623__545.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7197 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_234631__556 | 0 | 0.0 | 8.71872 | 0 | [87, 214] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234631__556.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7198 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_234644__542 | 0 | 0.0 | 12.1947 | 0 | [87, 303] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234644__542.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7199 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_234649__528 | 0 | 0.0 | 4.90084 | 0 | [87, 115] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234649__528.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 7200 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_234658__186 | 0 | 0.0 | 9.4983 | 1 | [87, 234] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234658__186.json | 56.25 | missing | {"num_gpu": 99} | missing |
| 7201 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234530__786 | 0 | 0.0 | 5.98935 | 0 | [128, 138] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234530__786.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7202 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_234540__807 | 0 | 0.0 | 10.2836 | 2 | [128, 249] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234540__807.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 7203 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_234554__508 | 0 | 0.0 | 13.6842 | 2 | [128, 335] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234554__508.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 7204 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234600__926 | 0 | 0.0 | 4.79094 | 0 | [128, 107] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234600__926.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7205 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_234612__545 | 0 | 0.0 | 12.4795 | 0 | [128, 305] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234612__545.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7206 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234442__126 | 0 | 0.0 | 14.9855 | 0 | [222, 331] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234442__126.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7207 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234455__403 | 0 | 0.0 | 12.9721 | 0 | [222, 302] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234455__403.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 7208 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234506__992 | 1 | 0.0 | 10.7989 | 2 | [222, 247] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234506__992.json | 67.5 | missing | {"num_gpu": 99} | missing |
| 7209 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_234516__279 | 0 | 0.0 | 9.73969 | 0 | [222, 220] | 0.10.0-DEV | 4 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234516__279.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 7210 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_234524__183 | 0 | 0.0 | 8.31752 | 0 | [222, 184] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234524__183.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7211 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234843__788 | 0 | 0.0 | 17.9292 | 0 | [395, 393] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234843__788.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7212 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234857__729 | 0 | 0.0 | 13.6827 | 0 | [395, 289] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234857__729.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7213 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_234909__682 | 0 | 0.0 | 12.5117 | 0 | [395, 260] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234909__682.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7214 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_234924__245 | 0 | 0.0 | 14.9846 | 0 | [395, 321] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234924__245.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7215 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234937__723 | 0 | 0.0 | 12.381 | 0 | [395, 257] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234937__723.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7216 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234717__219 | 0 | 0.0 | 18.0223 | 0 | [393, 395] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234717__219.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7217 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_234729__225 | 0 | 0.0 | 11.8805 | 0 | [393, 244] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234729__225.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7218 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234746__691 | 1 | 0.0 | 16.9101 | 2 | [393, 368] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234746__691.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7219 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234803__578 | 0 | 0.0 | 17.391 | 0 | [393, 380] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234803__578.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7220 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234825__500 | 1 | 0.0 | 21.3066 | 2 | [393, 475] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234825__500.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7221 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | true | true | 5 | 20231226_122105__837 | 0 | 0.0 | 9.80117 | 0 | [84, 176] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_122105__837.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7222 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_122121__497 | 0 | 0.0 | 15.5078 | 0 | [84, 284] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_122121__497.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7223 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_122042__263 | 0 | 0.0 | 13.447 | 0 | [87, 245] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122042__263.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7224 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_122055__584 | 0 | 0.0 | 12.8496 | 0 | [87, 234] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122055__584.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7225 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_004215__438 | 0 | 0.0 | 10.3957 | 0 | [87, 187] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_004215__438.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7226 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_122008__781 | 0 | 0.0 | 6.7766 | 0 | [128, 115] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122008__781.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7227 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_122028__928 | 0 | 0.0 | 20.7328 | 0 | [128, 377] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122028__928.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7228 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_004205__462 | 0 | 0.0 | 8.8331 | 0 | [128, 154] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_004205__462.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7229 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_121945__298 | 0 | 0.0 | 5.79401 | 0 | [222, 86] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121945__298.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7230 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122001__677 | 0 | 0.0 | 15.6676 | 0 | [222, 271] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122001__677.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7231 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_004156__687 | 0 | 0.0 | 30.2944 | 0 | [222, 375] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004156__687.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7232 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_122255__284 | 0 | 0.0 | 24.0008 | 0 | [395, 400] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122255__284.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7233 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_122318__589 | 0 | 0.0 | 22.8435 | 2 | [395, 379] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122318__589.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7234 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_004256__336 | 0 | 0.0 | 25.3983 | 0 | [395, 424] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004256__336.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7235 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_122152__868 | 0 | 0.0 | 31.5719 | 0 | [393, 536] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122152__868.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7236 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_122231__148 | 1 | 0.0 | 38.8991 | 2 | [393, 666] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122231__148.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7237 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_004231__709 | 0 | 0.0 | 15.6813 | 0 | [393, 248] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_004231__709.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7238 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_115432__612 | 0 | 0.0 | 133.053 | 0 | [88, 785] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_115432__612.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7239 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_115550__741 | 0 | 0.0 | 77.9773 | 2 | [88, 458] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_115550__741.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7240 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_115639__515 | 0 | 0.0 | 48.5566 | 2 | [88, 283] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_115639__515.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7241 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_151747__297 | 0 | 0.0 | 65.2108 | 0 | [88, 384] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_151747__297.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7242 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_151843__276 | 1 | 0.0 | 55.8891 | 2 | [88, 328] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_151843__276.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7243 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_115025__950 | 0 | 0.0 | 34.5608 | 0 | [127, 194] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_115025__950.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7244 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_115109__234 | 0 | 0.0 | 43.4343 | 0 | [127, 248] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_115109__234.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7245 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_115219__441 | 0 | 0.0 | 69.8194 | 0 | [127, 407] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_115219__441.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7246 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_151532__170 | 0 | 0.0 | 44.4433 | 0 | [127, 253] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_151532__170.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7247 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_151642__112 | 0 | 0.0 | 69.9384 | 0 | [127, 406] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_151642__112.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7248 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_114746__887 | 0 | 0.0 | 68.3729 | 0 | [220, 355] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114746__887.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7249 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_114844__262 | 1 | 0.0 | 57.2817 | 1 | [220, 315] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114844__262.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7250 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_114951__456 | 0 | 0.0 | 66.6715 | 0 | [220, 371] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114951__456.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7251 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_151351__734 | 0 | 0.0 | 38.4833 | 0 | [220, 201] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_151351__734.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7252 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_151447__311 | 1 | 0.0 | 56.4804 | 1 | [220, 309] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_151447__311.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7253 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_120159__737 | 1 | 0.0 | 77.1404 | 2 | [401, 396] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120159__737.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7254 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_120354__643 | 0 | 0.0 | 114.77 | 0 | [401, 611] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120354__643.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7255 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_120624__767 | 0 | 0.0 | 149.229 | 0 | [401, 784] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120624__767.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7256 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_152141__601 | 0 | 0.0 | 42.4428 | 2 | [401, 191] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_152141__601.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7257 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_152325__387 | 0 | 0.0 | 102.984 | 0 | [401, 543] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_152325__387.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7258 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_115749__856 | 0 | 0.0 | 70.3888 | 0 | [399, 356] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_115749__856.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7259 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_115926__770 | 0 | 0.0 | 96.4095 | 0 | [399, 506] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_115926__770.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7260 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_120041__760 | 0 | 0.0 | 74.756 | 2 | [399, 377] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_120041__760.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7261 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_151945__894 | 0 | 0.0 | 61.1968 | 0 | [399, 301] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_151945__894.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7262 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_152059__108 | 0 | 0.0 | 73.8718 | 0 | [399, 374] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_152059__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7263 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_150048__332 | 0 | 0.0 | 11.7715 | 0 | [93, 294] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_150048__332.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7264 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231225_150055__962 | 0 | 0.0 | 6.63606 | 0 | [93, 161] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_150055__962.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7265 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_150030__863 | 0 | 0.0 | 9.31086 | 0 | [96, 231] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_150030__863.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7266 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_150036__828 | 1 | 0.0 | 5.83328 | 1 | [96, 140] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_150036__828.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7267 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_004032__562 | 0 | 0.0 | 6.92805 | 0 | [96, 168] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_004032__562.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7268 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_150013__538 | 1 | 0.0 | 6.00908 | 1 | [137, 136] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_150013__538.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7269 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_150021__170 | 0 | 0.0 | 7.36175 | 0 | [137, 171] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_150021__170.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7270 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_004025__651 | 0 | 0.0 | 3.34153 | 1 | [137, 66] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_004025__651.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7271 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_145951__411 | 0 | 0.0 | 22.7207 | 0 | [231, 386] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145951__411.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7272 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_150007__576 | 0 | 0.0 | 16.1204 | 0 | [231, 379] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150007__576.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7273 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_004021__953 | 0 | 0.0 | 22.2407 | 0 | [231, 382] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004021__953.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7274 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_150152__868 | 1 | 0.0 | 31.6362 | 1 | [404, 725] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150152__868.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7275 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_150208__853 | 0 | 0.0 | 15.7094 | 0 | [404, 341] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150208__853.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7276 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_004049__355 | 1 | 0.0 | 9.98944 | 1 | [404, 198] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004049__355.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7277 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_150109__871 | 1 | 0.0 | 13.9512 | 1 | [402, 298] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_150109__871.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7278 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_150120__697 | 0 | 0.0 | 11.1489 | 0 | [402, 228] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_150120__697.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7279 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_004038__971 | 1 | 0.0 | 6.66467 | 1 | [402, 115] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_004038__971.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7280 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231214_005952__154 | 0 | 0.0 | 19.0639 | 0 | [72, 554] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_005952__154.json | 0.0 | missing | missing | missing | |
| 7281 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | AsIs | 1SHOT | true | true | 5 | 20231225_140949__341 | 0 | 0.0 | 9.00006 | 0 | [91, 288] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_140949__341.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7282 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | AsIs | 1SHOT | true | true | 5 | 20231225_140957__209 | 1 | 0.0 | 8.27667 | 1 | [91, 265] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_140957__209.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7283 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231214_005933__225 | 0 | 0.0 | 14.7763 | 0 | [89, 433] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_005933__225.json | 50.0 | missing | missing | missing | |
| 7284 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_140931__362 | 1 | 0.0 | 5.40041 | 1 | [94, 168] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_140931__362.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7285 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_140939__877 | 0 | 0.0 | 7.83858 | 0 | [94, 250] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_140939__877.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7286 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_001824__184 | 1 | 0.0 | 9.51902 | 2 | [94, 304] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_001824__184.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7287 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005918__873 | 0 | 0.0 | 12.8675 | 0 | [118, 367] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_005918__873.json | 50.0 | missing | missing | missing | |
| 7288 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_140916__615 | 0 | 0.0 | 11.4063 | 0 | [135, 357] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_140916__615.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7289 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_140926__232 | 1 | 0.0 | 9.27027 | 2 | [135, 288] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_140926__232.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7290 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_001814__257 | 0 | 0.0 | 11.0747 | 0 | [135, 345] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_001814__257.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7291 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_005905__554 | 0 | 0.0 | 28.4755 | 0 | [211, 746] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005905__554.json | 50.0 | missing | missing | missing | |
| 7292 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_140900__655 | 0 | 0.0 | 11.189 | 0 | [229, 163] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140900__655.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7293 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_140905__197 | 1 | 0.0 | 4.73552 | 2 | [229, 121] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140905__197.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7294 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_001803__284 | 0 | 0.0 | 17.1515 | 0 | [229, 363] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001803__284.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7295 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_010031__683 | 0 | 0.0 | 23.9321 | 0 | [11, 639] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_010031__683.json | 50.0 | missing | missing | missing | |
| 7296 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_141031__221 | 0 | 0.0 | 9.82892 | 0 | [402, 259] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_141031__221.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7297 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_141044__475 | 0 | 0.0 | 12.5201 | 0 | [402, 344] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_141044__475.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7298 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_001839__342 | 0 | 0.0 | 7.86678 | 0 | [402, 195] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001839__342.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7299 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_010007__876 | 0 | 0.0 | 15.3545 | 0 | [389, 327] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_010007__876.json | 0.0 | missing | missing | missing | |
| 7300 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_141013__403 | 0 | 0.0 | 16.2672 | 0 | [400, 460] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_141013__403.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7301 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_141021__982 | 1 | 0.0 | 7.59333 | 2 | [400, 187] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_141021__982.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7302 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001831__101 | 0 | 0.0 | 7.15872 | 0 | [400, 172] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_001831__101.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7303 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231214_073037__280 | 0 | 0.0 | 17.6149 | 0 | [72, 516] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__AsIs__1SHOT__20231214_073037__280.json | 0.0 | missing | missing | missing | |
| 7304 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_143429__158 | 0 | 0.0 | 16.681 | 0 | [89, 302] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__AsIs__1SHOT__20231225_143429__158.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7305 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_143449__504 | 0 | 0.0 | 19.2768 | 0 | [89, 351] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__AsIs__1SHOT__20231225_143449__504.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7306 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231214_073019__506 | 0 | 0.0 | 13.5043 | 0 | [89, 397] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__InJulia__1SHOT__20231214_073019__506.json | 0.0 | missing | missing | missing | |
| 7307 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_143410__389 | 0 | 0.0 | 3.40421 | 0 | [92, 50] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__InJulia__1SHOT__20231225_143410__389.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7308 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_143413__588 | 0 | 0.0 | 2.51248 | 0 | [92, 33] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__InJulia__1SHOT__20231225_143413__588.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7309 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231227_002707__693 | 0 | 0.0 | 18.5531 | 0 | [92, 336] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__InJulia__1SHOT__20231227_002707__693.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7310 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_073006__637 | 0 | 0.0 | 6.61338 | 0 | [118, 182] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_073006__637.json | 50.0 | missing | missing | missing | |
| 7311 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_143356__391 | 0 | 0.0 | 16.9653 | 0 | [131, 298] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_143356__391.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7312 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_143407__659 | 0 | 0.0 | 10.5739 | 0 | [131, 178] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_143407__659.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7313 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_002649__798 | 0 | 0.0 | 13.0209 | 0 | [131, 223] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_002649__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7314 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_072959__333 | 0 | 0.0 | 19.7094 | 0 | [211, 522] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_072959__333.json | 50.0 | missing | missing | missing | |
| 7315 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_143331__938 | 0 | 0.0 | 26.7265 | 0 | [224, 292] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143331__938.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7316 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_143339__302 | 0 | 0.0 | 7.84411 | 0 | [224, 115] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143339__302.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7317 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_002636__147 | 0 | 0.0 | 22.7331 | 0 | [224, 225] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002636__147.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7318 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_073128__267 | 0 | 0.0 | 21.1597 | 0 | [11, 568] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073128__267.json | 50.0 | missing | missing | missing | |
| 7319 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_143640__346 | 0 | 0.0 | 8.02687 | 0 | [395, 87] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143640__346.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7320 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_143724__334 | 0 | 0.0 | 44.3876 | 0 | [395, 721] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143724__334.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7321 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_002740__972 | 0 | 0.0 | 8.04379 | 0 | [395, 87] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002740__972.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7322 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_073107__586 | 0 | 0.0 | 29.9543 | 0 | [389, 700] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_073107__586.json | 0.0 | missing | missing | missing | |
| 7323 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_143552__987 | 0 | 0.0 | 63.4508 | 0 | [392, 1029] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_143552__987.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7324 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_143632__215 | 0 | 0.0 | 39.7004 | 0 | [392, 643] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_143632__215.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7325 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_002732__605 | 0 | 0.0 | 25.0342 | 0 | [392, 390] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_002732__605.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7326 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_150324__330 | 0 | 0.0 | 16.7348 | 0 | [80, 637] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_150324__330.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7327 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_150359__310 | 0 | 0.0 | 35.0637 | 0 | [80, 1258] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_150359__310.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7328 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_150239__427 | 0 | 0.0 | 1.4493 | 0 | [83, 49] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_150239__427.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7329 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_150308__656 | 0 | 0.0 | 28.9237 | 0 | [83, 1058] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_150308__656.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7330 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_004118__760 | 0 | 0.0 | 21.3184 | 0 | [83, 793] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_004118__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7331 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_150234__967 | 0 | 0.0 | 17.3613 | 0 | [120, 651] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_150234__967.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7332 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_150237__409 | 0 | 0.0 | 3.14171 | 0 | [120, 113] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_150237__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7333 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_004057__396 | 0 | 0.0 | 3.34953 | 0 | [120, 121] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_004057__396.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7334 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_150212__122 | 0 | 0.0 | 4.73362 | 0 | [211, 16] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150212__122.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7335 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_150217__454 | 0 | 0.0 | 4.18541 | 0 | [211, 140] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150217__454.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7336 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_004053__297 | 0 | 0.0 | 4.66874 | 0 | [211, 15] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004053__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7337 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_150427__597 | 0 | 0.0 | 4.62901 | 0 | [372, 131] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150427__597.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7338 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_150445__534 | 0 | 0.0 | 17.3485 | 0 | [372, 589] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150445__534.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7339 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_004126__601 | 0 | 0.0 | 5.37821 | 0 | [372, 158] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004126__601.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7340 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_150406__690 | 0 | 0.0 | 7.04964 | 0 | [369, 222] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_150406__690.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7341 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_150423__570 | 0 | 0.0 | 16.191 | 0 | [369, 549] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_150423__570.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7342 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_004120__555 | 0 | 0.0 | 2.07208 | 0 | [369, 33] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_004120__555.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7343 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231214_073209__931 | 0 | 0.0 | 15.0153 | 0 | [72, 442] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_073209__931.json | 0.0 | missing | missing | missing | |
| 7344 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | AsIs | 1SHOT | true | true | 5 | 20231225_144126__422 | 1 | 0.0 | 38.0427 | 1 | [97, 290] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_144126__422.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7345 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_144204__735 | 0 | 0.0 | 38.3323 | 0 | [97, 292] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_144204__735.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7346 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231214_073154__463 | 0 | 0.0 | 10.8179 | 0 | [89, 319] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_073154__463.json | 50.0 | missing | missing | missing | |
| 7347 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_144014__991 | 1 | 0.0 | 33.2064 | 1 | [100, 251] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_144014__991.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7348 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_144048__259 | 1 | 0.0 | 33.3743 | 1 | [100, 252] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_144048__259.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7349 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231227_003000__604 | 1 | 0.0 | 40.4034 | 1 | [100, 305] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_003000__604.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7350 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_073143__209 | 0 | 0.0 | 3.30541 | 0 | [118, 79] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_073143__209.json | 50.0 | missing | missing | missing | |
| 7351 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_143912__770 | 1 | 0.0 | 19.2435 | 1 | [139, 132] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_143912__770.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7352 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_143940__481 | 1 | 0.0 | 27.7433 | 1 | [139, 201] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_143940__481.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7353 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_002920__103 | 1 | 0.0 | 44.7649 | 1 | [139, 335] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_002920__103.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7354 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_073140__178 | 0 | 0.0 | 11.3937 | 0 | [211, 293] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073140__178.json | 0.0 | missing | missing | missing | |
| 7355 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_143812__987 | 1 | 0.0 | 47.8522 | 1 | [232, 168] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143812__987.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7356 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_143853__615 | 0 | 0.0 | 40.7208 | 0 | [232, 287] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143853__615.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7357 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_002835__257 | 1 | 0.0 | 54.502 | 1 | [232, 229] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002835__257.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7358 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_073233__398 | 0 | 0.0 | 2.93157 | 0 | [11, 78] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073233__398.json | 0.0 | missing | missing | missing | |
| 7359 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_144412__445 | 0 | 0.0 | 41.893 | 0 | [403, 266] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_144412__445.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7360 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_144439__578 | 1 | 0.0 | 25.9237 | 1 | [403, 140] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_144439__578.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7361 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_003148__688 | 1 | 0.0 | 42.9906 | 1 | [403, 272] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_003148__688.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7362 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_073230__605 | 0 | 0.0 | 21.2264 | 0 | [389, 483] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_073230__605.json | 50.0 | missing | missing | missing | |
| 7363 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_144248__506 | 1 | 0.0 | 43.6109 | 1 | [400, 278] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_144248__506.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7364 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_144330__537 | 1 | 0.0 | 42.1953 | 1 | [400, 267] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_144330__537.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7365 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_003105__207 | 1 | 0.0 | 63.6383 | 1 | [400, 430] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_003105__207.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7366 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_235050__474 | 0 | 0.0 | 11.2801 | 0 | [72, 336] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_235050__474.json | 0.0 | missing | missing | missing | |
| 7367 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231219_235104__393 | 0 | 0.0 | 14.3046 | 0 | [1, 439] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_235104__393.json | 0.0 | missing | missing | missing | |
| 7368 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231219_235123__853 | 0 | 0.0 | 18.7544 | 0 | [1, 564] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231219_235123__853.json | 50.0 | missing | missing | missing | |
| 7369 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_145549__976 | 0 | 0.0 | 18.8472 | 0 | [93, 318] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_145549__976.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7370 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_145611__771 | 0 | 0.0 | 21.9553 | 0 | [93, 372] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_145611__771.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7371 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231219_235031__346 | 0 | 0.0 | 9.65595 | 0 | [1, 303] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_235031__346.json | 50.0 | missing | missing | missing | |
| 7372 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_235039__258 | 0 | 0.0 | 7.68142 | 0 | [1, 244] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_235039__258.json | 25.0 | missing | missing | missing | |
| 7373 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_145513__473 | 0 | 0.0 | 14.203 | 0 | [96, 238] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_145513__473.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7374 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_145530__520 | 0 | 0.0 | 15.9989 | 0 | [96, 269] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_145530__520.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7375 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_003826__820 | 0 | 0.0 | 17.485 | 0 | [96, 294] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_003826__820.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7376 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_235003__771 | 0 | 0.0 | 2.11132 | 0 | [1, 68] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_235003__771.json | 25.0 | missing | missing | missing | |
| 7377 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231219_235010__993 | 0 | 0.0 | 7.55718 | 0 | [1, 237] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_235010__993.json | 50.0 | missing | missing | missing | |
| 7378 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_145439__658 | 0 | 0.0 | 12.431 | 0 | [137, 198] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_145439__658.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7379 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_145459__964 | 0 | 0.0 | 19.9462 | 0 | [137, 328] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_145459__964.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7380 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_003808__452 | 1 | 0.0 | 14.662 | 1 | [137, 236] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_003808__452.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7381 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_234940__384 | 0 | 0.0 | 15.2812 | 0 | [1, 447] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234940__384.json | 0.0 | missing | missing | missing | |
| 7382 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231219_234955__102 | 0 | 0.0 | 15.4767 | 0 | [1, 453] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234955__102.json | 50.0 | missing | missing | missing | |
| 7383 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_145411__592 | 0 | 0.0 | 21.7987 | 0 | [231, 189] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145411__592.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7384 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_145427__822 | 0 | 0.0 | 15.9172 | 0 | [231, 243] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145427__822.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7385 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_003754__852 | 0 | 0.0 | 22.9093 | 0 | [231, 214] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_003754__852.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7386 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231219_235249__931 | 0 | 0.0 | 19.1812 | 0 | [1, 525] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_235249__931.json | 50.0 | missing | missing | missing | |
| 7387 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_235303__308 | 0 | 0.0 | 14.0671 | 0 | [1, 393] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_235303__308.json | 0.0 | missing | missing | missing | |
| 7388 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_145702__540 | 0 | 0.0 | 17.0227 | 0 | [404, 236] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145702__540.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7389 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_145720__158 | 0 | 0.0 | 18.0215 | 0 | [404, 253] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145720__158.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7390 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_003904__261 | 0 | 0.0 | 15.7365 | 0 | [404, 214] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_003904__261.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7391 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231219_235158__209 | 0 | 0.0 | 14.6919 | 0 | [1, 410] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_235158__209.json | 50.0 | missing | missing | missing | |
| 7392 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_235211__192 | 0 | 0.0 | 13.6435 | 0 | [1, 382] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_235211__192.json | 0.0 | missing | missing | missing | |
| 7393 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_145631__986 | 0 | 0.0 | 20.1574 | 0 | [402, 289] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145631__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7394 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_145645__896 | 0 | 0.0 | 14.3253 | 0 | [402, 191] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145645__896.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7395 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_003848__498 | 1 | 0.0 | 21.8503 | 2 | [402, 316] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_003848__498.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7396 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231214_072853__951 | 0 | 0.0 | 14.4583 | 0 | [72, 427] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__AsIs__1SHOT__20231214_072853__951.json | 0.0 | missing | missing | missing | |
| 7397 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_143213__121 | 0 | 0.0 | 10.0345 | 0 | [93, 560] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_143213__121.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7398 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_143220__888 | 0 | 0.0 | 6.65403 | 0 | [93, 377] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_143220__888.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7399 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231214_072838__341 | 0 | 0.0 | 12.0489 | 0 | [89, 356] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_072838__341.json | 0.0 | missing | missing | missing | |
| 7400 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_143156__216 | 0 | 0.0 | 4.26577 | 0 | [96, 242] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_143156__216.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7401 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_143203__133 | 0 | 0.0 | 7.39367 | 0 | [96, 418] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_143203__133.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7402 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231227_002553__692 | 0 | 0.0 | 8.27899 | 0 | [96, 462] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_002553__692.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7403 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_072826__702 | 0 | 0.0 | 11.215 | 0 | [118, 320] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_072826__702.json | 0.0 | missing | missing | missing | |
| 7404 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_143146__337 | 0 | 0.0 | 7.47473 | 0 | [133, 411] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_143146__337.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7405 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_143151__766 | 0 | 0.0 | 5.36306 | 0 | [133, 292] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_143151__766.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7406 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_002544__558 | 0 | 0.0 | 6.10256 | 0 | [133, 332] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_002544__558.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7407 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_072815__284 | 0 | 0.0 | 10.9111 | 0 | [211, 279] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_072815__284.json | 0.0 | missing | missing | missing | |
| 7408 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_143131__468 | 0 | 0.0 | 13.4335 | 0 | [221, 551] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143131__468.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7409 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_143138__781 | 0 | 0.0 | 6.93661 | 0 | [221, 360] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143138__781.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7410 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_002538__509 | 0 | 0.0 | 8.48354 | 0 | [221, 299] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002538__509.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7411 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_072939__333 | 0 | 0.0 | 15.8275 | 0 | [11, 432] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_072939__333.json | 50.0 | missing | missing | missing | |
| 7412 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_143259__101 | 0 | 0.0 | 2.47409 | 0 | [383, 78] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143259__101.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7413 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_143305__984 | 0 | 0.0 | 5.9577 | 0 | [383, 265] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143305__984.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7414 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_002613__269 | 0 | 0.0 | 11.5303 | 0 | [383, 540] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002613__269.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7415 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_072923__936 | 0 | 0.0 | 30.3348 | 0 | [389, 709] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_072923__936.json | 0.0 | missing | missing | missing | |
| 7416 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_143233__162 | 0 | 0.0 | 13.4047 | 0 | [381, 633] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_143233__162.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7417 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_143256__505 | 0 | 0.0 | 23.0824 | 0 | [381, 1064] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_143256__505.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7418 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_002602__160 | 0 | 0.0 | 8.99096 | 0 | [381, 417] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_002602__160.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7419 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231214_010124__287 | 0 | 0.0 | 15.9398 | 0 | [72, 469] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__AsIs__1SHOT__20231214_010124__287.json | 0.0 | missing | missing | missing | |
| 7420 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | AsIs | 1SHOT | true | false | 5 | 20231225_141218__159 | 0 | 0.0 | 13.4849 | 0 | [93, 435] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_141218__159.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7421 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | AsIs | 1SHOT | true | true | 5 | 20231225_141230__436 | 1 | 0.0 | 12.3523 | 1 | [93, 398] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_141230__436.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7422 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | InJulia | 1SHOT | false | false | 5 | 20231214_010108__132 | 0 | 0.0 | 11.5249 | 0 | [89, 340] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_010108__132.json | 0.0 | missing | missing | missing | |
| 7423 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_141151__703 | 1 | 0.0 | 16.0643 | 1 | [96, 519] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_141151__703.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7424 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_141204__298 | 1 | 0.0 | 12.7557 | 1 | [96, 411] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_141204__298.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7425 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_001917__857 | 1 | 0.0 | 11.0773 | 1 | [96, 354] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_001917__857.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7426 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_010056__237 | 0 | 0.0 | 10.9506 | 0 | [118, 312] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_010056__237.json | 50.0 | missing | missing | missing | |
| 7427 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_141127__714 | 0 | 0.0 | 8.59477 | 0 | [137, 262] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_141127__714.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7428 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_141135__460 | 1 | 0.0 | 7.99233 | 1 | [137, 244] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_141135__460.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7429 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_001906__246 | 1 | 0.0 | 5.88616 | 1 | [137, 174] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_001906__246.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7430 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_010045__955 | 0 | 0.0 | 13.9154 | 0 | [211, 362] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_010045__955.json | 0.0 | missing | missing | missing | |
| 7431 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_141107__385 | 0 | 0.0 | 23.548 | 0 | [231, 554] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_141107__385.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7432 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_141118__416 | 1 | 0.0 | 10.423 | 1 | [231, 306] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_141118__416.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7433 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_001900__328 | 0 | 0.0 | 20.7668 | 0 | [231, 470] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001900__328.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7434 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_010157__736 | 0 | 0.0 | 13.955 | 0 | [11, 383] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_010157__736.json | 25.0 | missing | missing | missing | |
| 7435 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_141313__683 | 0 | 0.0 | 9.01912 | 0 | [404, 232] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_141313__683.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7436 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_141331__952 | 0 | 0.0 | 18.2488 | 0 | [404, 519] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_141331__952.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7437 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_001935__140 | 1 | 0.0 | 8.55234 | 1 | [404, 216] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001935__140.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7438 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_010143__230 | 0 | 0.0 | 19.1573 | 0 | [389, 432] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_010143__230.json | 50.0 | missing | missing | missing | |
| 7439 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_141245__683 | 1 | 0.0 | 14.6131 | 2 | [402, 409] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_141245__683.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7440 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_141304__331 | 0 | 0.0 | 18.7693 | 0 | [402, 536] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_141304__331.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7441 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001927__738 | 0 | 0.0 | 9.8246 | 1 | [402, 256] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_001927__738.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7442 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231214_010306__877 | 0 | 0.0 | 27.9812 | 0 | [72, 356] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__AsIs__1SHOT__20231214_010306__877.json | 0.0 | missing | missing | missing | |
| 7443 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231225_141934__198 | 0 | 0.0 | 60.9692 | 0 | [85, 461] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_141934__198.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7444 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231225_142014__964 | 0 | 0.0 | 39.7539 | 0 | [85, 298] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_142014__964.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7445 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231214_010238__247 | 0 | 0.0 | 14.2107 | 0 | [89, 419] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_010238__247.json | 50.0 | missing | missing | missing | |
| 7446 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_141657__153 | 0 | 0.0 | 59.1145 | 2 | [88, 447] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_141657__153.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7447 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_141833__603 | 1 | 0.0 | 96.2979 | 2 | [88, 724] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_141833__603.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7448 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231227_002143__695 | 0 | 0.0 | 41.9031 | 1 | [88, 313] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_002143__695.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7449 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_010224__749 | 0 | 0.0 | 9.71578 | 0 | [118, 276] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_010224__749.json | 50.0 | missing | missing | missing | |
| 7450 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_141504__479 | 1 | 0.0 | 30.4412 | 2 | [127, 220] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_141504__479.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7451 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_141557__145 | 0 | 0.0 | 53.7142 | 2 | [127, 399] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_141557__145.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7452 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_002101__135 | 1 | 0.0 | 29.0221 | 2 | [127, 208] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_002101__135.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7453 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_010214__362 | 0 | 0.0 | 17.3004 | 0 | [211, 458] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_010214__362.json | 50.0 | missing | missing | missing | |
| 7454 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_141401__480 | 0 | 0.0 | 29.2389 | 0 | [220, 8] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_141401__480.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7455 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_141433__238 | 1 | 0.0 | 32.4772 | 1 | [220, 218] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_141433__238.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7456 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_002032__816 | 0 | 0.0 | 55.9983 | 0 | [220, 226] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002032__816.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7457 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_010612__558 | 0 | 0.0 | 120.555 | 0 | [11, 863] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_010612__558.json | 0.0 | missing | missing | missing | |
| 7458 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_142253__532 | 0 | 0.0 | 56.5415 | 0 | [401, 363] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142253__532.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7459 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_142347__639 | 0 | 0.0 | 54.5099 | 2 | [401, 348] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142347__639.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7460 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_002307__790 | 0 | 0.0 | 28.5303 | 2 | [401, 153] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002307__790.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7461 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_010411__873 | 1 | 0.0 | 65.0708 | 0 | [389, 444] | 0.10.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_010411__873.json | 55.0 | missing | missing | missing | |
| 7462 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_142102__321 | 1 | 0.0 | 47.8097 | 2 | [399, 298] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_142102__321.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7463 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_142156__895 | 1 | 0.0 | 53.9988 | 2 | [399, 344] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_142156__895.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 7464 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_002239__585 | 0 | 0.0 | 55.7434 | 0 | [399, 356] | 0.10.0-DEV | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_002239__585.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7465 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231214_074042__293 | 0 | 0.0 | 16.0539 | 0 | [96, 466] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231214_074042__293.json | 0.0 | missing | missing | missing | |
| 7466 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | AsIs | 1SHOT | true | false | 5 | 20231225_152301__382 | 0 | 0.0 | 9.08999 | 0 | [118, 155] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_152301__382.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7467 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_152318__520 | 0 | 0.0 | 17.7784 | 0 | [118, 319] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_152318__520.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7468 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231214_074026__805 | 0 | 0.0 | 15.1538 | 0 | [113, 435] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_074026__805.json | 0.0 | missing | missing | missing | |
| 7469 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231225_152232__738 | 0 | 0.0 | 18.8136 | 0 | [121, 338] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_152232__738.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7470 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_152251__709 | 5 | 0.0 | 19.5704 | 4 | [121, 351] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_152251__709.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7471 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_005253__892 | 5 | 0.0 | 20.9975 | 4 | [121, 377] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_005253__892.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7472 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_074010__618 | 0 | 0.0 | 13.884 | 0 | [142, 388] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_074010__618.json | 25.0 | missing | missing | missing | |
| 7473 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_152210__505 | 0 | 0.0 | 5.04298 | 0 | [159, 72] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_152210__505.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7474 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_152213__919 | 0 | 0.0 | 2.85983 | 0 | [159, 30] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_152213__919.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7475 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_005232__402 | 4 | 0.0 | 15.4056 | 3 | [159, 267] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_005232__402.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 7476 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_073957__302 | 0 | 0.0 | 18.4513 | 0 | [217, 486] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073957__302.json | 25.0 | missing | missing | missing | |
| 7477 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152150__958 | 5 | 0.0 | 27.7109 | 4 | [235, 299] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152150__958.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7478 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152205__274 | 5 | 0.0 | 15.1112 | 4 | [235, 246] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152205__274.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7479 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_005216__628 | 5 | 0.0 | 24.5138 | 4 | [235, 246] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005216__628.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7480 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_074131__418 | 0 | 0.0 | 27.613 | 0 | [11, 720] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074131__418.json | 0.0 | missing | missing | missing | |
| 7481 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_152412__935 | 5 | 0.0 | 23.975 | 4 | [424, 369] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152412__935.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7482 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_152423__708 | 0 | 0.0 | 10.8193 | 0 | [424, 134] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152423__708.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7483 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_005322__400 | 5 | 0.0 | 13.1647 | 4 | [424, 176] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_005322__400.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7484 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_074104__202 | 0 | 0.0 | 22.0255 | 0 | [413, 494] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_074104__202.json | 25.0 | missing | missing | missing | |
| 7485 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_152334__241 | 0 | 0.0 | 15.7836 | 0 | [421, 224] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_152334__241.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7486 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_152348__219 | 3 | 0.0 | 14.0877 | 4 | [421, 193] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_152348__219.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7487 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_005308__284 | 4 | 0.0 | 15.5077 | 3 | [421, 218] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_005308__284.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 7488 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_014922__577 | 5 | 0.0 | 5.5377 | 4 | [0, 415] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_014922__577.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7489 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_014926__435 | 4 | 0.0 | 3.53981 | 3 | [0, 264] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_014926__435.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 7490 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_014929__118 | 5 | 0.0 | 3.36419 | 4 | [0, 242] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_014929__118.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7491 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_014934__782 | 5 | 0.0 | 4.48404 | 4 | [0, 322] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_014934__782.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7492 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_014938__779 | 0 | 0.0 | 4.33074 | 0 | [0, 310] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_014938__779.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7493 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_014845__855 | 0 | 0.0 | 3.03467 | 0 | [0, 217] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_014845__855.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7494 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_014847__128 | 5 | 0.0 | 2.54687 | 4 | [0, 184] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_014847__128.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7495 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_014851__554 | 5 | 0.0 | 3.51129 | 4 | [0, 250] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_014851__554.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7496 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_014853__762 | 3 | 0.0 | 2.01266 | 4 | [0, 145] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_014853__762.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7497 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_014857__481 | 5 | 0.0 | 3.65674 | 4 | [0, 261] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_014857__481.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7498 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_014819__513 | 3 | 0.0 | 1.45174 | 4 | [0, 106] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_014819__513.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7499 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_014821__799 | 0 | 0.0 | 2.58116 | 0 | [0, 188] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_014821__799.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7500 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_014826__559 | 5 | 0.0 | 4.73938 | 4 | [0, 343] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_014826__559.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7501 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_014829__185 | 3 | 0.0 | 2.86556 | 4 | [0, 209] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_014829__185.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7502 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_014831__369 | 0 | 0.0 | 1.28495 | 0 | [0, 94] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_014831__369.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7503 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_015036__528 | 0 | 0.0 | 3.1425 | 0 | [0, 226] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015036__528.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7504 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_015037__681 | 0 | 0.0 | 0.556589 | 0 | [0, 40] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015037__681.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7505 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_015040__446 | 0 | 0.0 | 2.74227 | 0 | [0, 197] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015040__446.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7506 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_015044__709 | 0 | 0.0 | 4.15769 | 0 | [0, 298] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015044__709.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7507 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_015046__614 | 0 | 0.0 | 1.94799 | 0 | [0, 140] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015046__614.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7508 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_015003__170 | 4 | 0.0 | 5.33586 | 3 | [0, 383] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015003__170.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 7509 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_015006__819 | 3 | 0.0 | 2.86948 | 4 | [0, 206] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015006__819.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7510 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_015008__269 | 0 | 0.0 | 1.72577 | 0 | [0, 124] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015008__269.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7511 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_015015__463 | 0 | 0.0 | 6.79648 | 0 | [0, 482] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015015__463.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7512 | NVIDIA-RTX-4090-4x | ispersonal | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_015017__436 | 0 | 0.0 | 2.75252 | 0 | [0, 198] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015017__436.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7513 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231214_074230__894 | 0 | 0.0 | 13.4673 | 0 | [96, 391] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__AsIs__1SHOT__20231214_074230__894.json | 0.0 | missing | missing | missing | |
| 7514 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_152515__537 | 0 | 0.0 | 6.14609 | 0 | [92, 103] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_152515__537.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7515 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_152527__921 | 0 | 0.0 | 12.4285 | 0 | [92, 224] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_152527__921.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7516 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_074216__436 | 0 | 0.0 | 17.2876 | 0 | [113, 495] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_074216__436.json | 25.0 | missing | missing | missing | |
| 7517 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_152504__162 | 0 | 0.0 | 11.4887 | 0 | [95, 206] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_152504__162.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7518 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_152509__895 | 0 | 0.0 | 4.53152 | 0 | [95, 72] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_152509__895.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7519 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_074159__223 | 0 | 0.0 | 10.1801 | 0 | [142, 281] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_074159__223.json | 0.0 | missing | missing | missing | |
| 7520 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_152443__980 | 0 | 0.0 | 4.03724 | 0 | [96, 63] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_152443__980.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7521 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_152452__435 | 0 | 0.0 | 9.60543 | 0 | [96, 170] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_152452__435.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7522 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_074148__871 | 0 | 0.0 | 17.0336 | 0 | [217, 448] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074148__871.json | 25.0 | missing | missing | missing | |
| 7523 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_152436__613 | 0 | 0.0 | 12.9216 | 0 | [110, 37] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152436__613.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7524 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_152439__370 | 0 | 0.0 | 2.53689 | 0 | [110, 29] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152439__370.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7525 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_074338__461 | 0 | 0.0 | 17.5613 | 0 | [11, 474] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074338__461.json | 25.0 | missing | missing | missing | |
| 7526 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_152541__380 | 0 | 0.0 | 1.55749 | 0 | [113, 10] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152541__380.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7527 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_152547__590 | 0 | 0.0 | 6.15518 | 0 | [113, 99] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152547__590.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7528 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_074321__901 | 0 | 0.0 | 51.1592 | 0 | [413, 1166] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_074321__901.json | 25.0 | missing | missing | missing | |
| 7529 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_152533__493 | 0 | 0.0 | 6.04138 | 0 | [110, 97] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_152533__493.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7530 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_152539__660 | 0 | 0.0 | 6.127 | 0 | [110, 99] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_152539__660.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7531 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_015623__578 | 0 | 0.0 | 3.97828 | 0 | [0, 144] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_015623__578.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7532 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_015629__174 | 0 | 0.0 | 6.22786 | 0 | [0, 225] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_015629__174.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7533 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_015635__747 | 5 | 0.0 | 6.08543 | 4 | [0, 220] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_015635__747.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7534 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_015641__730 | 0 | 0.0 | 5.86411 | 0 | [0, 212] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_015641__730.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7535 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_015650__647 | 0 | 0.0 | 8.48278 | 0 | [0, 306] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_015650__647.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7536 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_015523__581 | 0 | 0.0 | 9.89815 | 0 | [0, 356] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_015523__581.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7537 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_015530__757 | 0 | 0.0 | 6.74741 | 0 | [0, 243] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_015530__757.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7538 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_015537__886 | 0 | 0.0 | 7.11231 | 0 | [0, 256] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_015537__886.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7539 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_015540__956 | 0 | 0.0 | 2.81858 | 0 | [0, 102] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_015540__956.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7540 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_015547__800 | 0 | 0.0 | 7.81208 | 0 | [0, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_015547__800.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7541 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_015420__692 | 5 | 0.0 | 9.79939 | 4 | [0, 346] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_015420__692.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7542 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_015429__571 | 5 | 0.0 | 9.29188 | 4 | [0, 328] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_015429__571.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7543 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_015438__389 | 0 | 0.0 | 8.92112 | 0 | [0, 315] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_015438__389.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7544 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_015445__271 | 0 | 0.0 | 6.73364 | 0 | [0, 238] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_015445__271.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7545 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_015450__336 | 0 | 0.0 | 4.72168 | 0 | [0, 167] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_015450__336.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7546 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_015902__522 | 0 | 0.0 | 12.0173 | 0 | [0, 419] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015902__522.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7547 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_015927__574 | 0 | 0.0 | 24.7958 | 0 | [0, 866] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015927__574.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7548 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_015934__593 | 0 | 0.0 | 7.37738 | 0 | [0, 262] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015934__593.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7549 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_015944__478 | 0 | 0.0 | 9.67932 | 0 | [0, 344] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015944__478.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7550 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_015953__398 | 0 | 0.0 | 8.92872 | 0 | [0, 317] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_015953__398.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7551 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_015744__341 | 0 | 0.0 | 4.61244 | 0 | [0, 162] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015744__341.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7552 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_015748__636 | 0 | 0.0 | 3.35016 | 0 | [0, 118] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015748__636.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7553 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_015805__115 | 0 | 0.0 | 17.5026 | 0 | [0, 611] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015805__115.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7554 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_015810__820 | 0 | 0.0 | 4.66717 | 0 | [0, 164] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015810__820.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7555 | NVIDIA-RTX-4090-4x | ispersonal | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_015821__266 | 0 | 0.0 | 11.3479 | 0 | [0, 397] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_015821__266.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7556 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_014325__395 | 0 | 0.0 | 10.7833 | 0 | [0, 265] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_014325__395.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7557 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_014333__116 | 0 | 0.0 | 8.6918 | 0 | [0, 214] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_014333__116.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7558 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_014340__292 | 0 | 0.0 | 6.32463 | 0 | [0, 156] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_014340__292.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7559 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_014346__578 | 0 | 0.0 | 6.60777 | 0 | [0, 162] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_014346__578.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7560 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_014359__780 | 5 | 0.0 | 12.254 | 4 | [0, 301] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_014359__780.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7561 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_014156__323 | 0 | 0.0 | 14.2151 | 0 | [0, 345] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_014156__323.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7562 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_014159__870 | 0 | 0.0 | 2.56407 | 0 | [0, 62] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_014159__870.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7563 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_014206__789 | 0 | 0.0 | 6.96133 | 0 | [0, 169] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_014206__789.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7564 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_014216__827 | 0 | 0.0 | 10.5281 | 0 | [0, 256] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_014216__827.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7565 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_014231__489 | 5 | 0.0 | 15.1996 | 4 | [0, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_014231__489.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7566 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_013952__713 | 0 | 0.0 | 7.72296 | 0 | [0, 189] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_013952__713.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7567 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_014004__404 | 0 | 0.0 | 11.3142 | 0 | [0, 277] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_014004__404.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7568 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_014012__359 | 5 | 0.0 | 8.24754 | 4 | [0, 202] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_014012__359.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7569 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_014026__307 | 0 | 0.0 | 13.8055 | 0 | [0, 337] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_014026__307.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7570 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_014034__958 | 0 | 0.0 | 8.57193 | 0 | [0, 210] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_014034__958.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7571 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_014716__832 | 0 | 0.0 | 5.26717 | 0 | [0, 128] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_014716__832.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7572 | NVIDIA-RTX-4090-4x | ispersonal | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_014729__455 | 0 | 0.0 | 13.5069 | 0 | [0, 326] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_014729__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
*… (rows 7573–7641 of the raw evaluation results omitted for readability; same columns as in the table header above) …*
| 7642 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_155155__608 | 5 | 0.0 | 71.8722 | 4 | [440, 366] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155155__608.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7643 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_155254__306 | 0 | 0.0 | 59.3767 | 0 | [440, 296] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155254__306.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7644 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_010749__323 | 0 | 0.0 | 84.2359 | 0 | [440, 443] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_010749__323.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7645 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_154931__397 | 5 | 0.0 | 74.5578 | 4 | [438, 387] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_154931__397.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7646 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_155042__128 | 0 | 0.0 | 71.3681 | 0 | [438, 368] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155042__128.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7647 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_010625__652 | 0 | 0.0 | 68.8174 | 0 | [438, 352] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_010625__652.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7648 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_011444__132 | 0 | 0.0 | 7.45332 | 0 | [106, 282] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_011444__132.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7649 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_120723__832 | 0 | 0.0 | 12.0675 | 0 | [106, 456] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_120723__832.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7650 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_120730__308 | 0 | 0.0 | 7.3466 | 0 | [106, 278] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_120730__308.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7651 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_120742__112 | 0 | 0.0 | 11.8559 | 0 | [106, 446] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_120742__112.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7652 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_011436__614 | 0 | 0.0 | 6.0428 | 0 | [143, 222] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_011436__614.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7653 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_120658__305 | 0 | 0.0 | 7.39594 | 0 | [143, 274] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_120658__305.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7654 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_120703__722 | 0 | 0.0 | 5.41152 | 0 | [143, 196] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_120703__722.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7655 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_120711__429 | 0 | 0.0 | 7.72115 | 0 | [143, 286] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_120711__429.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7656 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_011430__358 | 0 | 0.0 | 12.5438 | 0 | [215, 330] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011430__358.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7657 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_120635__931 | 0 | 0.0 | 10.9537 | 0 | [215, 257] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_120635__931.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7658 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_120640__213 | 0 | 0.0 | 4.92063 | 0 | [215, 167] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_120640__213.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7659 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_120650__702 | 0 | 0.0 | 10.4192 | 0 | [215, 373] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_120650__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7660 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_011502__462 | 0 | 0.0 | 6.78554 | 0 | [395, 205] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011502__462.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7661 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_120823__839 | 0 | 0.0 | 8.533 | 0 | [395, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120823__839.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7662 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_120830__545 | 0 | 0.0 | 7.02728 | 0 | [395, 210] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120830__545.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7663 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_120844__520 | 0 | 0.0 | 14.7818 | 0 | [395, 486] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120844__520.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7664 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_011456__993 | 0 | 0.0 | 11.5829 | 0 | [392, 378] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_011456__993.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7665 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_120749__482 | 0 | 0.0 | 6.16703 | 0 | [392, 180] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_120749__482.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7666 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_120801__525 | 0 | 0.0 | 12.6417 | 0 | [392, 411] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_120801__525.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7667 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_120814__532 | 0 | 0.0 | 12.8522 | 0 | [392, 419] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_120814__532.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7668 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_110902__523 | 0 | 0.0 | 5.53254 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110902__523.json | 0.0 | missing | missing | missing | |
| 7669 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_110904__387 | 0 | 0.0 | 1.53284 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110904__387.json | 0.0 | missing | missing | missing | |
| 7670 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_110906__897 | 0 | 0.0 | 2.05515 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110906__897.json | 25.0 | missing | missing | missing | |
| 7671 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_110910__830 | 0 | 0.0 | 4.77889 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110910__830.json | 0.0 | missing | missing | missing | |
| 7672 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_110919__757 | 0 | 0.0 | 8.59809 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_110919__757.json | 25.0 | missing | missing | missing | |
| 7673 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_110827__580 | 0 | 0.0 | 2.48585 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110827__580.json | 0.0 | missing | missing | missing | |
| 7674 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240217_110829__567 | 0 | 0.0 | 1.97235 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110829__567.json | 25.0 | missing | missing | missing | |
| 7675 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240217_110832__695 | 0 | 0.0 | 2.80954 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110832__695.json | 25.0 | missing | missing | missing | |
| 7676 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_110835__273 | 0 | 0.0 | 2.89346 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110835__273.json | 0.0 | missing | missing | missing | |
| 7677 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240217_110837__319 | 0 | 0.0 | 2.17744 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_110837__319.json | 25.0 | missing | missing | missing | |
| 7678 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_110752__491 | 0 | 0.0 | 3.4982 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110752__491.json | 25.0 | missing | missing | missing | |
| 7679 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_110757__210 | 0 | 0.0 | 4.87905 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110757__210.json | 25.0 | missing | missing | missing | |
| 7680 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_110801__856 | 0 | 0.0 | 4.1396 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110801__856.json | 0.0 | missing | missing | missing | |
| 7681 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_110807__147 | 0 | 0.0 | 5.22331 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110807__147.json | 0.0 | missing | missing | missing | |
| 7682 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_110811__947 | 0 | 0.0 | 4.87996 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_110811__947.json | 25.0 | missing | missing | missing | |
| 7683 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_111025__978 | 0 | 0.0 | 3.1813 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111025__978.json | 25.0 | missing | missing | missing | |
| 7684 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_111040__223 | 0 | 0.0 | 15.609 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111040__223.json | 25.0 | missing | missing | missing | |
| 7685 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_111044__923 | 0 | 0.0 | 3.29861 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111044__923.json | 0.0 | missing | missing | missing | |
| 7686 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_111052__396 | 0 | 0.0 | 8.32096 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111052__396.json | 25.0 | missing | missing | missing | |
| 7687 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_111055__770 | 0 | 0.0 | 3.13738 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111055__770.json | 25.0 | missing | missing | missing | |
| 7688 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_110949__481 | 0 | 0.0 | 8.72087 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110949__481.json | 25.0 | missing | missing | missing | |
| 7689 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_110951__888 | 0 | 0.0 | 2.64453 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110951__888.json | 0.0 | missing | missing | missing | |
| 7690 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_110955__379 | 0 | 0.0 | 3.74915 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110955__379.json | 25.0 | missing | missing | missing | |
| 7691 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_110959__729 | 0 | 0.0 | 3.78697 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_110959__729.json | 0.0 | missing | missing | missing | |
| 7692 | Apple-MacBook-Pro-M1 | ispersonal | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_111006__310 | 0 | 0.0 | 6.83883 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111006__310.json | 25.0 | missing | missing | missing | |
| 7693 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_232606__900 | 0 | 0.0 | 8.30811 | 0 | [0, 127] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_232606__900.json | 0.0 | missing | missing | missing | |
| 7694 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_232615__914 | 0 | 0.0 | 8.32864 | 0 | [0, 126] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_232615__914.json | 0.0 | missing | missing | missing | |
| 7695 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_232628__469 | 0 | 0.0 | 12.7998 | 0 | [0, 194] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_232628__469.json | 0.0 | missing | missing | missing | |
| 7696 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_232636__723 | 0 | 0.0 | 8.17178 | 0 | [0, 128] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_232636__723.json | 0.0 | missing | missing | missing | |
| 7697 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_232655__732 | 0 | 0.0 | 18.8471 | 0 | [0, 285] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_232655__732.json | 0.0 | missing | missing | missing | |
| 7698 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240223_232422__929 | 0 | 0.0 | 11.0662 | 0 | [0, 173] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_232422__929.json | 25.0 | missing | missing | missing | |
| 7699 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_232430__177 | 0 | 0.0 | 7.77844 | 0 | [0, 116] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_232430__177.json | 0.0 | missing | missing | missing | |
| 7700 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240223_232442__990 | 0 | 0.0 | 12.7417 | 0 | [0, 196] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_232442__990.json | 25.0 | missing | missing | missing | |
| 7701 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_232450__736 | 0 | 0.0 | 7.6263 | 0 | [0, 116] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_232450__736.json | 0.0 | missing | missing | missing | |
| 7702 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240223_232500__558 | 0 | 0.0 | 10.0156 | 0 | [0, 154] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_232500__558.json | 25.0 | missing | missing | missing | |
| 7703 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_232238__162 | 0 | 0.0 | 17.5956 | 0 | [0, 268] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_232238__162.json | 0.0 | missing | missing | missing | |
| 7704 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_232254__477 | 0 | 0.0 | 16.879 | 0 | [0, 258] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_232254__477.json | 0.0 | missing | missing | missing | |
| 7705 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_232310__127 | 0 | 0.0 | 15.576 | 0 | [0, 235] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_232310__127.json | 0.0 | missing | missing | missing | |
| 7706 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_232324__354 | 0 | 0.0 | 13.8901 | 0 | [0, 211] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_232324__354.json | 25.0 | missing | missing | missing | |
| 7707 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_232335__146 | 0 | 0.0 | 11.1418 | 0 | [0, 170] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_232335__146.json | 25.0 | missing | missing | missing | |
| 7708 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240223_233241__423 | 0 | 0.0 | 20.6265 | 0 | [0, 308] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_233241__423.json | 0.0 | missing | missing | missing | |
| 7709 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240223_233316__704 | 0 | 0.0 | 34.8043 | 0 | [0, 523] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_233316__704.json | 0.0 | missing | missing | missing | |
| 7710 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240223_233331__493 | 0 | 0.0 | 15.0864 | 0 | [0, 231] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_233331__493.json | 25.0 | missing | missing | missing | |
| 7711 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240223_233349__860 | 0 | 0.0 | 18.8071 | 0 | [0, 284] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_233349__860.json | 0.0 | missing | missing | missing | |
| 7712 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240223_233412__819 | 0 | 0.0 | 22.3292 | 0 | [0, 338] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_233412__819.json | 25.0 | missing | missing | missing | |
| 7713 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_232847__808 | 0 | 0.0 | 20.3814 | 0 | [0, 310] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_232847__808.json | 0.0 | missing | missing | missing | |
| 7714 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20240223_232902__490 | 0 | 0.0 | 14.889 | 0 | [0, 226] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_232902__490.json | 25.0 | missing | missing | missing | |
| 7715 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_232927__304 | 0 | 0.0 | 24.4591 | 0 | [0, 373] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_232927__304.json | 0.0 | missing | missing | missing | |
| 7716 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20240223_232942__240 | 0 | 0.0 | 15.1534 | 0 | [0, 229] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_232942__240.json | 25.0 | missing | missing | missing | |
| 7717 | Apple-MacBook-Pro-M1 | ispersonal | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_233001__902 | 0 | 0.0 | 19.4799 | 0 | [0, 294] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_233001__902.json | 0.0 | missing | missing | missing | |
| 7718 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231213_202619__983 | 0 | 0.0004695 | 7.03919 | 0 | [96, 281] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_202619__983.json | 0.0 | missing | missing | missing | |
| 7719 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_193401__345 | 0 | 0.000456 | 4.09274 | 0 | [96, 272] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_193401__345.json | 0.0 | missing | missing | missing | |
| 7720 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_193405__414 | 0 | 0.0004665 | 4.65622 | 0 | [96, 279] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_193405__414.json | 0.0 | missing | missing | missing | |
| 7721 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo--optim | AsIs | 1SHOT | false | false | 5 | 20231215_194937__879 | 0 | 0.0 | 5.07468 | 0 | [96, 248] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_194937__879.json | 0.0 | 0.5 | missing | 0.5 | |
| 7722 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_202611__824 | 0 | 0.0004275 | 6.47393 | 0 | [99, 252] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_202611__824.json | 50.0 | missing | missing | missing | |
| 7723 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_193353__909 | 5 | 0.0004005 | 3.64312 | 4 | [99, 234] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_193353__909.json | 100.0 | missing | missing | missing | |
| 7724 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_193357__417 | 0 | 0.000399 | 3.57198 | 0 | [99, 233] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_193357__417.json | 50.0 | missing | missing | missing | |
| 7725 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_200842__448 | 5 | 0.0004215 | 4.08748 | 4 | [99, 248] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_200842__448.json | 100.0 | missing | missing | missing | |
| 7726 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_200848__529 | 5 | 0.0005445 | 5.50975 | 4 | [99, 330] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_200848__529.json | 100.0 | missing | missing | missing | |
| 7727 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_194932__983 | 0 | 0.0 | 4.89469 | 0 | [99, 238] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_194932__983.json | 50.0 | 0.5 | missing | 0.5 | |
| 7728 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202605__220 | 0 | 0.000337 | 4.02516 | 0 | [134, 180] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_202605__220.json | 50.0 | missing | missing | missing | |
| 7729 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_193346__564 | 0 | 0.000295 | 2.16388 | 0 | [134, 152] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_193346__564.json | 25.0 | missing | missing | missing | |
| 7730 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_193349__471 | 0 | 0.000358 | 2.82783 | 0 | [134, 194] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_193349__471.json | 25.0 | missing | missing | missing | |
| 7731 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200836__330 | 0 | 0.0003955 | 3.53745 | 0 | [134, 219] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_200836__330.json | 50.0 | missing | missing | missing | |
| 7732 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_200838__331 | 0 | 0.000172 | 1.35019 | 0 | [134, 70] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_200838__331.json | 25.0 | missing | missing | missing | |
| 7733 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_194927__306 | 0 | 0.0 | 4.67034 | 0 | [134, 223] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_194927__306.json | 50.0 | 0.5 | missing | 0.5 | |
| 7734 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_202601__997 | 0 | 0.000455 | 5.70108 | 0 | [196, 238] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202601__997.json | 25.0 | missing | missing | missing | |
| 7735 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_193340__332 | 5 | 0.000374 | 2.94383 | 4 | [196, 184] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193340__332.json | 100.0 | missing | missing | missing | |
| 7736 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_193344__758 | 0 | 0.0004505 | 3.67133 | 0 | [196, 235] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193344__758.json | 50.0 | missing | missing | missing | |
| 7737 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_200829__225 | 0 | 0.000161 | 0.953121 | 0 | [196, 42] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200829__225.json | 25.0 | missing | missing | missing | |
| 7738 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200833__868 | 5 | 0.0004355 | 4.22393 | 4 | [196, 225] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200833__868.json | 100.0 | missing | missing | missing | |
| 7739 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_194922__228 | 5 | 0.0 | 4.63997 | 4 | [196, 213] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_194922__228.json | 100.0 | 0.5 | missing | 0.5 | |
| 7740 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202630__619 | 0 | 0.000579 | 6.29171 | 0 | [357, 267] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202630__619.json | 50.0 | missing | missing | missing | |
| 7741 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_193418__780 | 0 | 0.0003825 | 2.38222 | 0 | [357, 136] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193418__780.json | 0.0 | missing | missing | missing | |
| 7742 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_193423__276 | 0 | 0.000507 | 4.40449 | 0 | [357, 219] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193423__276.json | 0.0 | missing | missing | missing | |
| 7743 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_200859__811 | 0 | 0.0005565 | 4.27628 | 0 | [357, 252] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200859__811.json | 25.0 | missing | missing | missing | |
| 7744 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_200903__290 | 0 | 0.0005055 | 4.10313 | 0 | [357, 218] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200903__290.json | 25.0 | missing | missing | missing | |
| 7745 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_194947__404 | 5 | 0.0 | 5.20186 | 4 | [357, 250] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_194947__404.json | 100.0 | 0.5 | missing | 0.5 | |
| 7746 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_202623__803 | 0 | 0.0004855 | 4.5649 | 0 | [356, 205] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_202623__803.json | 25.0 | missing | missing | missing | |
| 7747 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_193410__891 | 0 | 0.000598 | 4.23044 | 0 | [356, 280] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_193410__891.json | 25.0 | missing | missing | missing | |
| 7748 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_193416__301 | 0 | 0.000736 | 6.00078 | 0 | [356, 372] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_193416__301.json | 25.0 | missing | missing | missing | |
| 7749 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_200849__963 | 0 | 0.000253 | 1.21182 | 0 | [356, 50] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_200849__963.json | 0.0 | missing | missing | missing | |
| 7750 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_200854__409 | 0 | 0.0005965 | 5.50778 | 0 | [356, 279] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_200854__409.json | 25.0 | missing | missing | missing | |
| 7751 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_194942__626 | 0 | 0.0 | 5.19857 | 4 | [356, 246] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_194942__626.json | 75.0 | 0.5 | missing | 0.5 | |
| 7752 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200507__598 | 5 | 0.0003735 | 1.78723 | 4 | [99, 216] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200507__598.json | 100.0 | missing | missing | missing | |
| 7753 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200509__471 | 0 | 0.000408 | 1.77546 | 0 | [99, 239] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200509__471.json | 50.0 | missing | missing | missing | |
| 7754 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200511__586 | 5 | 0.0003525 | 1.71161 | 4 | [99, 202] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200511__586.json | 100.0 | missing | missing | missing | |
| 7755 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200512__337 | 5 | 0.000297 | 1.36281 | 4 | [99, 165] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200512__337.json | 100.0 | missing | missing | missing | |
| 7756 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200514__437 | 0 | 0.000348 | 1.74753 | 0 | [99, 199] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200514__437.json | 50.0 | missing | missing | missing | |
| 7757 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200502__195 | 3 | 0.00019 | 0.975366 | 4 | [134, 82] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200502__195.json | 90.0 | missing | missing | missing | |
| 7758 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200503__885 | 5 | 0.0001795 | 0.763034 | 4 | [134, 75] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200503__885.json | 100.0 | missing | missing | missing | |
| 7759 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200504__816 | 5 | 0.000196 | 0.859241 | 4 | [134, 86] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200504__816.json | 100.0 | missing | missing | missing | |
| 7760 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200505__466 | 5 | 0.000217 | 1.06978 | 4 | [134, 100] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200505__466.json | 100.0 | missing | missing | missing | |
| 7761 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_200506__715 | 0 | 0.0001915 | 0.751451 | 0 | [134, 83] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200506__715.json | 25.0 | missing | missing | missing | |
| 7762 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200457__183 | 5 | 0.0002495 | 1.1019 | 4 | [196, 101] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200457__183.json | 100.0 | missing | missing | missing | |
| 7763 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200458__540 | 5 | 0.000251 | 0.901509 | 4 | [196, 102] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200458__540.json | 100.0 | missing | missing | missing | |
| 7764 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200459__278 | 5 | 0.0002555 | 1.10775 | 4 | [196, 105] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200459__278.json | 100.0 | missing | missing | missing | |
| 7765 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200500__641 | 5 | 0.0002465 | 1.06506 | 4 | [196, 99] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200500__641.json | 100.0 | missing | missing | missing | |
| 7766 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200501__364 | 5 | 0.0002075 | 0.893567 | 4 | [196, 73] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200501__364.json | 100.0 | missing | missing | missing | |
| 7767 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200523__113 | 5 | 0.0004845 | 1.73508 | 4 | [357, 204] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200523__113.json | 100.0 | missing | missing | missing | |
| 7768 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200524__991 | 0 | 0.0002805 | 0.669118 | 0 | [357, 68] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200524__991.json | 0.0 | missing | missing | missing | |
| 7769 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200526__307 | 0 | 0.000282 | 0.672667 | 0 | [357, 69] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200526__307.json | 0.0 | missing | missing | missing | |
| 7770 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200526__883 | 5 | 0.0004965 | 1.75054 | 4 | [357, 212] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200526__883.json | 100.0 | missing | missing | missing | |
| 7771 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200527__950 | 0 | 0.0003075 | 1.02896 | 0 | [357, 86] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200527__950.json | 0.0 | missing | missing | missing | |
| 7772 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200515__445 | 5 | 0.0003385 | 1.02551 | 4 | [356, 107] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200515__445.json | 100.0 | missing | missing | missing | |
| 7773 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200517__499 | 5 | 0.00043 | 1.36734 | 4 | [356, 168] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200517__499.json | 100.0 | missing | missing | missing | |
| 7774 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_200518__520 | 0 | 0.000394 | 1.36897 | 0 | [356, 144] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200518__520.json | 25.0 | missing | missing | missing | |
| 7775 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200519__261 | 1 | 0.000418 | 1.44035 | 1 | [356, 160] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200519__261.json | 61.25 | missing | missing | missing | |
| 7776 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200521__757 | 0 | 0.000463 | 1.67787 | 0 | [356, 190] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200521__757.json | 50.0 | missing | missing | missing | |
| 7777 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231213_202644__621 | 0 | 0.00056 | 4.39898 | 0 | [96, 232] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_202644__621.json | 0.0 | missing | missing | missing | |
| 7778 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_193438__770 | 0 | 0.000536 | 3.07361 | 0 | [96, 220] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_193438__770.json | 0.0 | missing | missing | missing | |
| 7779 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_193441__344 | 0 | 0.000562 | 2.87492 | 0 | [96, 233] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_193441__344.json | 0.0 | missing | missing | missing | |
| 7780 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195002__259 | 0 | 0.0 | 5.89275 | 0 | [96, 231] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_195002__259.json | 0.0 | 0.9 | missing | 0.1 |
| 7781 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231213_202639__646 | 5 | 0.000575 | 4.92232 | 4 | [99, 238] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_202639__646.json | 100.0 | missing | missing | missing | |
| 7782 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_193432__166 | 5 | 0.000509 | 3.12513 | 4 | [99, 205] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_193432__166.json | 100.0 | missing | missing | missing | |
| 7783 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_193435__579 | 0 | 0.000429 | 2.2018 | 0 | [99, 165] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_193435__579.json | 50.0 | missing | missing | missing | |
| 7784 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_200913__383 | 3 | 0.000523 | 2.81277 | 4 | [99, 212] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_200913__383.json | 90.0 | missing | missing | missing | |
| 7785 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_200916__504 | 5 | 0.000531 | 3.03434 | 4 | [99, 216] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_200916__504.json | 100.0 | missing | missing | missing | |
| 7786 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_194955__646 | 0 | 0.0 | 4.14589 | 0 | [99, 202] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_194955__646.json | 50.0 | 0.9 | missing | 0.1 |
| 7787 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202634__471 | 5 | 0.00031 | 1.68835 | 4 | [134, 88] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_202634__471.json | 100.0 | missing | missing | missing | |
| 7788 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_193427__464 | 0 | 0.000286 | 1.37799 | 0 | [134, 76] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_193427__464.json | 50.0 | missing | missing | missing | |
| 7789 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_193429__850 | 0 | 0.00031 | 1.6955 | 0 | [134, 88] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_193429__850.json | 25.0 | missing | missing | missing | |
| 7790 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_200908__595 | 0 | 0.000288 | 1.40229 | 0 | [134, 77] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_200908__595.json | 25.0 | missing | missing | missing | |
| 7791 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_200910__265 | 5 | 0.000348 | 1.76194 | 4 | [134, 107] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_200910__265.json | 100.0 | missing | missing | missing | |
| 7792 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_194951__702 | 5 | 0.0 | 1.56688 | 4 | [134, 88] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_194951__702.json | 100.0 | 0.9 | missing | 0.1 |
| 7793 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202632__284 | 5 | 0.000378 | 2.74692 | 4 | [196, 91] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202632__284.json | 100.0 | missing | missing | missing | |
| 7794 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_193424__612 | 5 | 0.000372 | 1.4279 | 4 | [196, 88] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193424__612.json | 100.0 | missing | missing | missing | |
| 7795 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_193426__633 | 5 | 0.00037 | 1.67986 | 4 | [196, 87] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193426__633.json | 100.0 | missing | missing | missing | |
| 7796 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200904__690 | 5 | 0.00032 | 1.51036 | 4 | [196, 62] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200904__690.json | 100.0 | missing | missing | missing | |
| 7797 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200907__233 | 5 | 0.000424 | 2.18295 | 4 | [196, 114] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200907__233.json | 100.0 | missing | missing | missing | |
| 7798 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_194950__938 | 5 | 0.0 | 1.95187 | 4 | [196, 91] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_194950__938.json | 100.0 | 0.9 | missing | 0.1 |
| 7799 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_202651__346 | 0 | 0.000517 | 1.68419 | 0 | [357, 80] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202651__346.json | 0.0 | missing | missing | missing | |
| 7800 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_193446__200 | 0 | 0.000487 | 1.12339 | 0 | [357, 65] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193446__200.json | 25.0 | missing | missing | missing | |
| 7801 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_193449__651 | 0 | 0.000763 | 2.79921 | 0 | [357, 203] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193449__651.json | 50.0 | missing | missing | missing | |
| 7802 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_200924__432 | 0 | 0.000591 | 2.13658 | 0 | [357, 117] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200924__432.json | 0.0 | missing | missing | missing | |
| 7803 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_200929__354 | 5 | 0.000847 | 4.48591 | 4 | [357, 245] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_200929__354.json | 100.0 | missing | missing | missing | |
| 7804 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231215_195008__798 | 0 | 0.0 | 4.22539 | 0 | [357, 171] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195008__798.json | 25.0 | 0.9 | missing | 0.1 |
| 7805 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202649__494 | 5 | 0.000924 | 5.45883 | 4 | [356, 284] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_202649__494.json | 100.0 | missing | missing | missing | |
| 7806 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_193444__726 | 5 | 0.00071 | 2.9053 | 4 | [356, 177] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_193444__726.json | 100.0 | missing | missing | missing | |
| 7807 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_193445__814 | 0 | 0.00045 | 0.87961 | 0 | [356, 47] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_193445__814.json | 0.0 | missing | missing | missing | |
| 7808 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_200918__549 | 0 | 0.000526 | 1.89479 | 0 | [356, 85] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_200918__549.json | 0.0 | missing | missing | missing | |
| 7809 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_200922__801 | 5 | 0.000866 | 4.0212 | 4 | [356, 255] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_200922__801.json | 100.0 | missing | missing | missing | |
| 7810 | Apple-MacBook-Pro-M1 | ispersonal | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_195003__124 | 5 | 0.0 | 1.69614 | 4 | [356, 62] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_195003__124.json | 100.0 | 0.9 | missing | 0.1 |
| 7811 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_095139__327 | 5 | 0.01398 | 33.0989 | 4 | [99, 433] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_095139__327.json | 100.0 | missing | missing | missing | |
| 7812 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_095213__373 | 0 | 0.01203 | 33.8536 | 0 | [99, 368] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_095213__373.json | 50.0 | missing | missing | missing | |
| 7813 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_095237__530 | 0 | 0.01365 | 23.9321 | 0 | [99, 422] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_095237__530.json | 50.0 | missing | missing | missing | |
| 7814 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_095316__840 | 5 | 0.0171 | 39.0725 | 4 | [99, 537] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_095316__840.json | 100.0 | missing | missing | missing | |
| 7815 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | InJulia | 1SHOT | true | false | 5 | 20240201_095402__626 | 0 | 0.01383 | 45.2161 | 0 | [99, 428] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_095402__626.json | 25.0 | missing | missing | missing | |
| 7816 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_094633__674 | 0 | 0.00698 | 27.2494 | 0 | [134, 188] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_094633__674.json | 50.0 | missing | missing | missing | |
| 7817 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_094644__799 | 0 | 0.00704 | 11.1407 | 0 | [134, 190] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_094644__799.json | 50.0 | missing | missing | missing | |
| 7818 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_094657__631 | 0 | 0.0065 | 12.8072 | 0 | [134, 172] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_094657__631.json | 50.0 | missing | missing | missing | |
| 7819 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_094729__519 | 0 | 0.01043 | 31.9632 | 0 | [134, 303] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_094729__519.json | 50.0 | missing | missing | missing | |
| 7820 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_094802__162 | 0 | 0.01169 | 33.0673 | 0 | [134, 345] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_094802__162.json | 25.0 | missing | missing | missing | |
| 7821 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_094222__166 | 0 | 0.01159 | 24.0999 | 0 | [196, 321] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_094222__166.json | 25.0 | missing | missing | missing | |
| 7822 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_094251__477 | 5 | 0.01387 | 29.081 | 4 | [196, 397] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_094251__477.json | 100.0 | missing | missing | missing | |
| 7823 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_094317__961 | 5 | 0.01216 | 25.9699 | 4 | [196, 340] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_094317__961.json | 100.0 | missing | missing | missing | |
| 7824 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_094347__516 | 0 | 0.01219 | 29.2936 | 0 | [196, 341] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_094347__516.json | 25.0 | missing | missing | missing | |
| 7825 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_094409__247 | 5 | 0.01333 | 22.261 | 4 | [196, 379] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_094409__247.json | 100.0 | missing | missing | missing | |
| 7826 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_100636__135 | 0 | 0.0186 | 41.444 | 0 | [357, 501] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_100636__135.json | 50.0 | missing | missing | missing | |
| 7827 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_100722__763 | 5 | 0.01848 | 46.2735 | 4 | [357, 497] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_100722__763.json | 100.0 | missing | missing | missing | |
| 7828 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_100824__299 | 0 | 0.01722 | 61.2519 | 0 | [357, 455] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_100824__299.json | 25.0 | missing | missing | missing | |
| 7829 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_100855__468 | 0 | 0.01203 | 31.6533 | 0 | [357, 282] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_100855__468.json | 50.0 | missing | missing | missing | |
| 7830 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_100953__740 | 0 | 0.02022 | 57.7833 | 0 | [357, 555] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_100953__740.json | 50.0 | missing | missing | missing | |
| 7831 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_095922__560 | 3 | 0.01382 | 39.9763 | 4 | [356, 342] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_095922__560.json | 90.0 | missing | missing | missing | |
| 7832 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_095959__267 | 5 | 0.01775 | 37.1523 | 4 | [356, 473] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_095959__267.json | 100.0 | missing | missing | missing | |
| 7833 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_100035__473 | 0 | 0.01586 | 36.3474 | 0 | [356, 410] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_100035__473.json | 25.0 | missing | missing | missing | |
| 7834 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_100112__218 | 0 | 0.01724 | 36.6668 | 0 | [356, 456] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_100112__218.json | 25.0 | missing | missing | missing | |
| 7835 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_100158__973 | 0 | 0.01982 | 46.4512 | 0 | [356, 542] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_100158__973.json | 50.0 | missing | missing | missing | |
| 7836 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231213_202832__673 | 0 | 0.01011 | 19.8167 | 0 | [96, 305] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_202832__673.json | 0.0 | missing | missing | missing | |
| 7837 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_193711__723 | 0 | 0.01146 | 21.3164 | 0 | [96, 350] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_193711__723.json | 0.0 | missing | missing | missing | |
| 7838 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_193728__201 | 0 | 0.01074 | 16.9759 | 0 | [96, 326] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_193728__201.json | 0.0 | missing | missing | missing | |
| 7839 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195143__295 | 0 | 0.0 | 21.8615 | 0 | [96, 329] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_195143__295.json | 0.0 | 0.1 | missing | 0.9 |
| 7840 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_202812__336 | 0 | 0.01746 | 45.4439 | 0 | [99, 549] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_202812__336.json | 50.0 | missing | missing | missing | |
| 7841 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_193630__541 | 0 | 0.01335 | 20.7902 | 0 | [99, 412] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_193630__541.json | 50.0 | missing | missing | missing | |
| 7842 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_193650__219 | 5 | 0.01551 | 19.822 | 4 | [99, 484] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_193650__219.json | 100.0 | missing | missing | missing | |
| 7843 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_201058__550 | 0 | 0.01014 | 26.9022 | 0 | [99, 305] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_201058__550.json | 50.0 | missing | missing | missing | |
| 7844 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_201118__509 | 0 | 0.01056 | 19.4229 | 0 | [99, 319] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_201118__509.json | 50.0 | missing | missing | missing | |
| 7845 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_195121__235 | 5 | 0.0 | 41.0505 | 4 | [99, 504] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_195121__235.json | 100.0 | 0.1 | missing | 0.9 | |
| 7846 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_202726__377 | 0 | 0.00869 | 17.431 | 0 | [134, 245] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_202726__377.json | 50.0 | missing | missing | missing | |
| 7847 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_193558__703 | 5 | 0.01136 | 12.2067 | 4 | [134, 334] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_193558__703.json | 100.0 | missing | missing | missing | |
| 7848 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_193609__523 | 0 | 0.00986 | 10.9226 | 0 | [134, 284] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_193609__523.json | 50.0 | missing | missing | missing | |
| 7849 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201019__602 | 0 | 0.00833 | 16.73 | 0 | [134, 233] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_201019__602.json | 50.0 | missing | missing | missing | |
| 7850 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201031__323 | 0 | 0.00593 | 12.0249 | 0 | [134, 153] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_201031__323.json | 50.0 | missing | missing | missing | |
| 7851 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_195040__958 | 0 | 0.0 | 25.2512 | 0 | [134, 245] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_195040__958.json | 50.0 | 0.1 | missing | 0.9 | |
| 7852 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202709__811 | 5 | 0.00874 | 17.4847 | 4 | [196, 226] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202709__811.json | 100.0 | missing | missing | missing | |
| 7853 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_193532__308 | 0 | 0.01297 | 42.7906 | 0 | [196, 367] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193532__308.json | 25.0 | missing | missing | missing | |
| 7854 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_193546__982 | 5 | 0.00883 | 13.7234 | 4 | [196, 229] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193546__982.json | 100.0 | missing | missing | missing | |
| 7855 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_200948__159 | 5 | 0.01192 | 19.6308 | 4 | [196, 332] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_200948__159.json | 100.0 | missing | missing | missing | |
| 7856 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_201002__153 | 5 | 0.0085 | 13.9072 | 4 | [196, 218] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201002__153.json | 100.0 | missing | missing | missing | |
| 7857 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_195015__826 | 5 | 0.0 | 6.79659 | 4 | [196, 100] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195015__826.json | 100.0 | 0.1 | missing | 0.9 | |
| 7858 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_202941__254 | 0 | 0.0192 | 41.6896 | 0 | [357, 521] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_202941__254.json | 50.0 | missing | missing | missing | |
| 7859 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_193814__317 | 0 | 0.01722 | 22.0045 | 0 | [357, 455] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193814__317.json | 25.0 | missing | missing | missing | |
| 7860 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_193829__815 | 0 | 0.01431 | 15.2044 | 0 | [357, 358] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193829__815.json | 50.0 | missing | missing | missing | |
| 7861 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_201241__563 | 0 | 0.02109 | 43.2682 | 0 | [357, 584] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201241__563.json | 0.0 | missing | missing | missing | |
| 7862 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_201320__672 | 0 | 0.01575 | 38.8022 | 0 | [357, 406] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201320__672.json | 50.0 | missing | missing | missing | |
| 7863 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_195234__143 | 0 | 0.0 | 29.4875 | 0 | [357, 389] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195234__143.json | 50.0 | 0.1 | missing | 0.9 | |
| 7864 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_202859__491 | 0 | 0.01547 | 27.4815 | 0 | [356, 397] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_202859__491.json | 50.0 | missing | missing | missing | |
| 7865 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_193743__439 | 0 | 0.01871 | 14.9342 | 0 | [356, 505] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_193743__439.json | 25.0 | missing | missing | missing | |
| 7866 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_193752__632 | 0 | 0.0122 | 8.67039 | 0 | [356, 288] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_193752__632.json | 50.0 | missing | missing | missing | |
| 7867 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_201137__925 | 0 | 0.0143 | 18.8516 | 0 | [356, 358] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_201137__925.json | 50.0 | missing | missing | missing | |
| 7868 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_201158__135 | 0 | 0.01388 | 21.2367 | 0 | [356, 344] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_201158__135.json | 25.0 | missing | missing | missing | |
| 7869 | Apple-MacBook-Pro-M1 | ispersonal | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_195205__907 | 5 | 0.0 | 21.8469 | 4 | [356, 297] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_195205__907.json | 100.0 | 0.1 | missing | 0.9 | |
| 7870 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | AsIs | 1SHOT | false | false | 5 | 20231214_073331__381 | 0 | 0.0 | 18.8423 | 0 | [96, 543] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__AsIs__1SHOT__20231214_073331__381.json | 0.0 | missing | missing | missing | |
| 7871 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_150616__265 | 0 | 0.0 | 9.82613 | 0 | [96, 289] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__AsIs__1SHOT__20231225_150616__265.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7872 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_150627__693 | 0 | 0.0 | 11.4289 | 0 | [1, 354] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__AsIs__1SHOT__20231225_150627__693.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7873 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | InJulia | 1SHOT | true | false | 5 | 20231214_073313__114 | 0 | 0.0 | 11.4733 | 0 | [113, 330] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__InJulia__1SHOT__20231214_073313__114.json | 25.0 | missing | missing | missing | |
| 7874 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_150549__320 | 0 | 0.0 | 16.4283 | 0 | [113, 473] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__InJulia__1SHOT__20231225_150549__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7875 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_150606__884 | 0 | 0.0 | 16.4942 | 0 | [1, 498] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__InJulia__1SHOT__20231225_150606__884.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7876 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | InJulia | 1SHOT | false | false | 5 | 20231227_004408__610 | 0 | 0.0 | 14.19 | 0 | [113, 415] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__InJulia__1SHOT__20231227_004408__610.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7877 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_073301__291 | 0 | 0.0 | 10.3625 | 0 | [142, 286] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_073301__291.json | 25.0 | missing | missing | missing | |
| 7878 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_150519__541 | 0 | 0.0 | 13.0245 | 0 | [142, 365] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_150519__541.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7879 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_150533__433 | 0 | 0.0 | 14.0356 | 0 | [1, 423] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_150533__433.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7880 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_004354__842 | 0 | 0.0 | 9.57441 | 0 | [142, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_004354__842.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7881 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_073251__832 | 0 | 0.0 | 17.3102 | 0 | [217, 455] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073251__832.json | 25.0 | missing | missing | missing | |
| 7882 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_150457__537 | 0 | 0.0 | 12.4388 | 0 | [235, 177] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150457__537.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7883 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_150506__393 | 0 | 0.0 | 8.8189 | 0 | [1, 266] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150506__393.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7884 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_004344__597 | 0 | 0.0 | 13.3088 | 0 | [235, 207] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004344__597.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7885 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_073431__607 | 0 | 0.0 | 26.7839 | 0 | [11, 699] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073431__607.json | 0.0 | missing | missing | missing | |
| 7886 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_150743__320 | 0 | 0.0 | 25.922 | 0 | [11, 684] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150743__320.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7887 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_150803__462 | 0 | 0.0 | 19.9087 | 0 | [1, 541] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150803__462.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7888 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_004500__231 | 0 | 0.0 | 19.1455 | 0 | [11, 521] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004500__231.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7889 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_073404__406 | 0 | 0.0 | 32.971 | 0 | [413, 761] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_073404__406.json | 25.0 | missing | missing | missing | |
| 7890 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_150654__703 | 0 | 0.0 | 26.3314 | 0 | [413, 605] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_150654__703.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7891 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_150717__523 | 0 | 0.0 | 23.1621 | 0 | [1, 622] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_150717__523.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7892 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_004440__219 | 0 | 0.0 | 32.5411 | 0 | [413, 761] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_004440__219.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7893 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | AsIs | 1SHOT | false | false | 5 | 20231214_074436__672 | 0 | 0.0 | 19.9852 | 0 | [96, 574] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__AsIs__1SHOT__20231214_074436__672.json | 0.0 | missing | missing | missing | |
| 7894 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_152655__757 | 0 | 0.0 | 10.0319 | 0 | [110, 324] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__AsIs__1SHOT__20231225_152655__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7895 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_152710__184 | 0 | 0.0 | 15.1046 | 0 | [110, 491] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__AsIs__1SHOT__20231225_152710__184.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7896 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | InJulia | 1SHOT | true | false | 5 | 20231214_074416__317 | 0 | 0.0 | 12.3713 | 0 | [113, 356] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__InJulia__1SHOT__20231214_074416__317.json | 25.0 | missing | missing | missing | |
| 7897 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_152636__424 | 4 | 0.0 | 10.3217 | 3 | [113, 334] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__InJulia__1SHOT__20231225_152636__424.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 7898 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_152645__271 | 0 | 0.0 | 8.4865 | 0 | [113, 271] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__InJulia__1SHOT__20231225_152645__271.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7899 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_005359__990 | 0 | 0.0 | 10.6798 | 0 | [113, 343] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__InJulia__1SHOT__20231227_005359__990.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7900 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_074404__875 | 0 | 0.0 | 9.81742 | 0 | [142, 270] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_074404__875.json | 25.0 | missing | missing | missing | |
| 7901 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_152615__880 | 5 | 0.0 | 7.88556 | 4 | [152, 246] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_152615__880.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7902 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_152626__763 | 4 | 0.0 | 10.1604 | 3 | [152, 321] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_152626__763.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 7903 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_005348__532 | 5 | 0.0 | 7.71003 | 4 | [152, 239] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_005348__532.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7904 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_074354__514 | 0 | 0.0 | 15.7895 | 0 | [217, 414] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074354__514.json | 25.0 | missing | missing | missing | |
| 7905 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152600__946 | 5 | 0.0 | 13.0923 | 4 | [227, 204] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152600__946.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7906 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152607__712 | 5 | 0.0 | 6.90933 | 4 | [227, 197] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152607__712.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7907 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_005340__842 | 5 | 0.0 | 18.9434 | 4 | [227, 400] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005340__842.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7908 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_074540__517 | 0 | 0.0 | 31.8717 | 0 | [11, 820] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074540__517.json | 0.0 | missing | missing | missing | |
| 7909 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_152743__743 | 0 | 0.0 | 10.723 | 0 | [416, 292] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152743__743.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7910 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_152755__612 | 0 | 0.0 | 12.0131 | 0 | [416, 333] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152755__612.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7911 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_005420__902 | 0 | 0.0 | 10.4265 | 0 | [416, 280] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_005420__902.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7912 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_074508__933 | 0 | 0.0 | 31.4976 | 0 | [413, 726] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_074508__933.json | 0.0 | missing | missing | missing | |
| 7913 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_152723__502 | 0 | 0.0 | 13.2438 | 0 | [413, 371] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_152723__502.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7914 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_152732__734 | 0 | 0.0 | 9.02177 | 0 | [413, 238] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_152732__734.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7915 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_005410__240 | 3 | 0.0 | 10.4878 | 4 | [413, 282] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_005410__240.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7916 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_181408__465 | 0 | 0.0 | 15.5665 | 0 | [113, 297] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181408__465.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7917 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_181421__824 | 0 | 0.0 | 13.3397 | 0 | [113, 253] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181421__824.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7918 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181439__659 | 5 | 0.0 | 18.1706 | 4 | [113, 348] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181439__659.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7919 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181312__139 | 4 | 0.0 | 17.5705 | 3 | [152, 332] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181312__139.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 7920 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181330__800 | 4 | 0.0 | 18.4887 | 3 | [152, 350] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181330__800.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 7921 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_181352__784 | 0 | 0.0 | 22.1591 | 0 | [152, 421] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181352__784.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7922 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_181226__451 | 0 | 0.0 | 18.5524 | 0 | [227, 339] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181226__451.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7923 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_181243__621 | 0 | 0.0 | 17.0947 | 0 | [227, 311] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181243__621.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7924 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_181254__106 | 1 | 0.0 | 11.2522 | 1 | [227, 197] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181254__106.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 7925 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_181544__599 | 0 | 0.0 | 15.3087 | 0 | [416, 255] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181544__599.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7926 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_181608__727 | 0 | 0.0 | 23.2654 | 0 | [416, 398] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181608__727.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7927 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181629__525 | 0 | 0.0 | 21.0057 | 0 | [416, 341] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181629__525.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7928 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181457__246 | 5 | 0.0 | 17.7422 | 4 | [413, 301] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181457__246.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7929 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_181514__427 | 0 | 0.0 | 16.1417 | 0 | [413, 270] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181514__427.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7930 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181529__469 | 5 | 0.0 | 14.7465 | 4 | [413, 244] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181529__469.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7931 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231213_203142__594 | 0 | 0.00720046 | 19.1755 | 0 | [108, 854] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__AsIs__1SHOT__20231213_203142__594.json | 0.0 | missing | missing | missing | |
| 7932 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_194213__802 | 0 | 0.00389974 | 15.1707 | 0 | [108, 446] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__AsIs__1SHOT__20231225_194213__802.json | 0.0 | missing | missing | missing | |
| 7933 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_194229__492 | 0 | 0.0045793 | 15.9761 | 0 | [108, 530] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__AsIs__1SHOT__20231225_194229__492.json | 0.0 | missing | missing | missing | |
| 7934 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195445__406 | 0 | 0.0 | 26.1147 | 0 | [108, 431] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__AsIs__1SHOT__20231215_195445__406.json | 0.0 | 0.9 | missing | 0.3 | |
| 7935 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231213_203123__643 | 0 | 0.00404537 | 10.3384 | 0 | [111, 463] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__InJulia__1SHOT__20231213_203123__643.json | 25.0 | missing | missing | missing | |
| 7936 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231225_194146__655 | 0 | 0.00254872 | 14.8184 | 0 | [111, 278] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__InJulia__1SHOT__20231225_194146__655.json | 25.0 | missing | missing | missing | |
| 7937 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231225_194158__594 | 0 | 0.00449841 | 11.602 | 0 | [111, 519] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__InJulia__1SHOT__20231225_194158__594.json | 25.0 | missing | missing | missing | |
| 7938 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_201621__562 | 0 | 0.00436088 | 29.725 | 0 | [111, 502] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__InJulia__1SHOT__20231227_201621__562.json | 50.0 | missing | missing | missing | |
| 7939 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231227_201644__441 | 0 | 0.00350334 | 22.5777 | 0 | [111, 396] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__InJulia__1SHOT__20231227_201644__441.json | 25.0 | missing | missing | missing | |
| 7940 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium--optim | InJulia | 1SHOT | true | false | 5 | 20231215_195419__279 | 0 | 0.0 | 30.3062 | 0 | [111, 468] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__InJulia__1SHOT__20231215_195419__279.json | 25.0 | 0.9 | missing | 0.3 | |
| 7941 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_203112__887 | 0 | 0.0042073 | 13.9558 | 0 | [150, 470] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_203112__887.json | 25.0 | missing | missing | missing | |
| 7942 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_194121__245 | 0 | 0.00372999 | 9.29498 | 0 | [150, 411] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_194121__245.json | 25.0 | missing | missing | missing | |
| 7943 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194131__805 | 0 | 0.00395651 | 9.87969 | 0 | [150, 439] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_194131__805.json | 50.0 | missing | missing | missing | |
| 7944 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201529__929 | 0 | 0.00317987 | 13.5293 | 0 | [150, 343] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_201529__929.json | 50.0 | missing | missing | missing | |
| 7945 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_201551__242 | 0 | 0.00408595 | 21.8197 | 0 | [150, 455] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_201551__242.json | 25.0 | missing | missing | missing | |
| 7946 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231215_195349__192 | 0 | 0.0 | 13.6205 | 0 | [150, 613] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_195349__192.json | 25.0 | 0.9 | missing | 0.3 | |
| 7947 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_203058__711 | 5 | 0.00466868 | 11.3494 | 4 | [225, 502] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203058__711.json | 100.0 | missing | missing | missing | |
| 7948 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_194057__711 | 0 | 0.00372215 | 28.9212 | 0 | [225, 385] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194057__711.json | 25.0 | missing | missing | missing | |
| 7949 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194111__801 | 5 | 0.00393249 | 13.9774 | 4 | [225, 411] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194111__801.json | 100.0 | missing | missing | missing | |
| 7950 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_201506__445 | 0 | 0.0027918 | 6.16077 | 0 | [225, 270] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201506__445.json | 25.0 | missing | missing | missing | |
| 7951 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_201516__213 | 5 | 0.00393249 | 9.23311 | 4 | [225, 411] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201516__213.json | 100.0 | missing | missing | missing | |
| 7952 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_195335__999 | 5 | 0.0 | 10.2372 | 4 | [225, 461] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195335__999.json | 100.0 | 0.9 | missing | 0.3 | |
| 7953 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_203229__127 | 0 | 0.00551606 | 38.1835 | 0 | [413, 544] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203229__127.json | 25.0 | missing | missing | missing | |
| 7954 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_194312__220 | 0 | 0.00630079 | 14.721 | 0 | [413, 641] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194312__220.json | 25.0 | missing | missing | missing | |
| 7955 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_194324__126 | 0 | 0.00507111 | 11.2288 | 0 | [413, 489] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194324__126.json | 25.0 | missing | missing | missing | |
| 7956 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_201734__199 | 0 | 0.00573449 | 27.4663 | 0 | [413, 571] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201734__199.json | 25.0 | missing | missing | missing | |
| 7957 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_201759__868 | 0 | 0.0089624 | 24.7619 | 0 | [413, 970] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201759__868.json | 0.0 | missing | missing | missing | |
| 7958 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231215_195534__475 | 0 | 0.0 | 18.5676 | 0 | [413, 476] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195534__475.json | 25.0 | 0.9 | missing | 0.3 | |
| 7959 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_203150__945 | 0 | 0.00417311 | 8.65445 | 0 | [410, 379] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_203150__945.json | 25.0 | missing | missing | missing | |
| 7960 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_194245__190 | 0 | 0.00679427 | 16.1421 | 0 | [410, 703] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_194245__190.json | 25.0 | missing | missing | missing | |
| 7961 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_194258__216 | 0 | 0.00554032 | 12.6724 | 0 | [410, 548] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_194258__216.json | 25.0 | missing | missing | missing | |
| 7962 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_201649__253 | 0 | 0.001916 | 5.70469 | 0 | [410, 100] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_201649__253.json | 0.0 | missing | missing | missing | |
| 7963 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_201707__559 | 0 | 0.00367153 | 17.2412 | 0 | [410, 317] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_201707__559.json | 25.0 | missing | missing | missing | |
| 7964 | Apple-MacBook-Pro-M1 | ispersonal | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | false | 5 | 20231215_195515__570 | 0 | 0.0 | 29.8765 | 0 | [410, 429] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_195515__570.json | 25.0 | 0.9 | missing | 0.3 | |
| 7965 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | AsIs | 1SHOT | true | true | 5 | 20231213_203034__317 | 5 | 0.000632474 | 3.96824 | 4 | [102, 292] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__AsIs__1SHOT__20231213_203034__317.json | 100.0 | missing | missing | missing | |
| 7966 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_193946__355 | 0 | 0.000684854 | 4.4072 | 0 | [102, 319] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__AsIs__1SHOT__20231225_193946__355.json | 0.0 | missing | missing | missing | |
| 7967 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_193952__237 | 0 | 0.000981674 | 6.94482 | 0 | [102, 472] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__AsIs__1SHOT__20231225_193952__237.json | 0.0 | missing | missing | missing | |
| 7968 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195312__508 | 0 | 0.0 | 5.74442 | 0 | [102, 426] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__AsIs__1SHOT__20231215_195312__508.json | 0.0 | 0.9 | missing | 0.3 | |
| 7969 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231213_203030__472 | 0 | 0.000618895 | 3.85739 | 0 | [105, 284] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__InJulia__1SHOT__20231213_203030__472.json | 50.0 | missing | missing | missing | |
| 7970 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_193936__363 | 0 | 0.000764395 | 4.98079 | 0 | [105, 359] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__InJulia__1SHOT__20231225_193936__363.json | 50.0 | missing | missing | missing | |
| 7971 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_193941__881 | 0 | 0.000754695 | 4.80908 | 0 | [105, 354] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__InJulia__1SHOT__20231225_193941__881.json | 50.0 | missing | missing | missing | |
| 7972 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_201421__361 | 0 | 0.000630535 | 4.02538 | 0 | [105, 290] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__InJulia__1SHOT__20231227_201421__361.json | 50.0 | missing | missing | missing | |
| 7973 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_201425__919 | 0 | 0.000735295 | 4.69846 | 0 | [105, 344] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__InJulia__1SHOT__20231227_201425__919.json | 50.0 | missing | missing | missing | |
| 7974 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small--optim | InJulia | 1SHOT | true | true | 5 | 20231215_195306__503 | 0 | 0.0 | 4.60242 | 0 | [105, 339] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__InJulia__1SHOT__20231215_195306__503.json | 50.0 | 0.9 | missing | 0.3 | |
| 7975 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203026__443 | 5 | 0.000659002 | 3.92693 | 4 | [146, 291] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_203026__443.json | 100.0 | missing | missing | missing | |
| 7976 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_193928__600 | 0 | 0.000686162 | 4.19424 | 0 | [146, 305] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_193928__600.json | 25.0 | missing | missing | missing | |
| 7977 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_193931__451 | 5 | 0.000513502 | 3.0759 | 4 | [146, 216] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_193931__451.json | 100.0 | missing | missing | missing | |
| 7978 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_201414__334 | 0 | 0.000626022 | 3.74215 | 0 | [146, 274] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_201414__334.json | 25.0 | missing | missing | missing | |
| 7979 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_201417__201 | 0 | 0.000490222 | 2.92379 | 0 | [146, 204] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_201417__201.json | 25.0 | missing | missing | missing | |
| 7980 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231215_195302__428 | 0 | 0.0 | 3.14009 | 0 | [146, 235] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_195302__428.json | 25.0 | 0.9 | missing | 0.3 |
| 7981 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_203022__723 | 0 | 0.00061182 | 3.43 | 0 | [220, 242] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203022__723.json | 25.0 | missing | missing | missing | |
| 7982 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_193918__496 | 5 | 0.00092222 | 5.46176 | 4 | [220, 402] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193918__496.json | 100.0 | missing | missing | missing | |
| 7983 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_193923__196 | 0 | 0.0009746 | 5.85415 | 0 | [220, 429] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193923__196.json | 25.0 | missing | missing | missing | |
| 7984 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_201405__667 | 5 | 0.0008679 | 5.07427 | 4 | [220, 374] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201405__667.json | 100.0 | missing | missing | missing | |
| 7985 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_201410__215 | 5 | 0.00082134 | 4.78383 | 4 | [220, 350] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201410__215.json | 100.0 | missing | missing | missing | |
| 7986 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_195259__309 | 0 | 0.0 | 4.90801 | 0 | [220, 373] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195259__309.json | 25.0 | 0.9 | missing | 0.3 |
| 7987 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_203046__305 | 0 | 0.00107554 | 5.83679 | 0 | [412, 417] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203046__305.json | 25.0 | missing | missing | missing | |
| 7988 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_194021__204 | 0 | 0.00123656 | 6.85623 | 0 | [412, 500] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194021__204.json | 25.0 | missing | missing | missing | |
| 7989 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_194028__217 | 0 | 0.00130834 | 7.39487 | 0 | [412, 537] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194028__217.json | 25.0 | missing | missing | missing | |
| 7990 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_201446__250 | 0 | 0.00175842 | 10.4549 | 0 | [412, 769] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201446__250.json | 25.0 | missing | missing | missing | |
| 7991 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_201500__250 | 0 | 0.00140922 | 13.9854 | 0 | [412, 589] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201500__250.json | 50.0 | missing | missing | missing | |
| 7992 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231215_195325__421 | 0 | 0.0 | 5.52502 | 0 | [412, 413] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195325__421.json | 25.0 | 0.9 | missing | 0.3 |
| 7993 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_203040__850 | 0 | 0.00114991 | 6.27034 | 0 | [410, 456] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_203040__850.json | 25.0 | missing | missing | missing | |
| 7994 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194008__778 | 0 | 0.00120423 | 15.8569 | 0 | [410, 484] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_194008__778.json | 50.0 | missing | missing | missing | |
| 7995 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194014__955 | 0 | 0.00098113 | 5.08123 | 0 | [410, 369] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_194014__955.json | 50.0 | missing | missing | missing | |
| 7996 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_201432__700 | 0 | 0.00107619 | 6.08317 | 0 | [410, 418] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_201432__700.json | 25.0 | missing | missing | missing | |
| 7997 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_201436__418 | 0 | 0.00080265 | 3.9854 | 0 | [410, 277] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_201436__418.json | 25.0 | missing | missing | missing | |
| 7998 | Apple-MacBook-Pro-M1 | ispersonal | mistral-small--optim | JuliaRecapTask | 1SHOT | true | false | 5 | 20231215_195319__518 | 0 | 0.0 | 6.62992 | 0 | [410, 493] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_195319__518.json | 25.0 | 0.9 | missing | 0.3 |
| 7999 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_203006__650 | 0 | 0.000160146 | 5.5804 | 0 | [102, 322] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__AsIs__1SHOT__20231213_203006__650.json | 0.0 | missing | missing | missing | |
| 8000 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_193853__909 | 0 | 0.000158787 | 2.83687 | 0 | [102, 319] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__AsIs__1SHOT__20231225_193853__909.json | 0.0 | missing | missing | missing | |
| 8001 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_193857__886 | 0 | 0.000216318 | 3.91087 | 0 | [102, 446] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__AsIs__1SHOT__20231225_193857__886.json | 0.0 | missing | missing | missing | |
| 8002 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195247__415 | 0 | 0.0 | 2.57954 | 0 | [102, 296] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__AsIs__1SHOT__20231215_195247__415.json | 0.0 | 0.9 | missing | 0.3 |
| 8003 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | InJulia | 1SHOT | false | false | 5 | 20231213_203000__173 | 0 | 0.000180951 | 6.21562 | 0 | [105, 367] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__InJulia__1SHOT__20231213_203000__173.json | 0.0 | missing | missing | missing | |
| 8004 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | InJulia | 1SHOT | true | false | 5 | 20231225_193846__640 | 0 | 0.000187293 | 3.36916 | 0 | [105, 381] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__InJulia__1SHOT__20231225_193846__640.json | 25.0 | missing | missing | missing | |
| 8005 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | InJulia | 1SHOT | false | false | 5 | 20231225_193850__776 | 0 | 0.000154677 | 3.44976 | 0 | [105, 309] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__InJulia__1SHOT__20231225_193850__776.json | 0.0 | missing | missing | missing | |
| 8006 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_201341__341 | 2 | 0.000207678 | 3.65381 | 2 | [105, 426] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__InJulia__1SHOT__20231227_201341__341.json | 72.5 | missing | missing | missing | |
| 8007 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | InJulia | 1SHOT | true | false | 5 | 20231227_201345__990 | 0 | 0.000197712 | 3.49987 | 0 | [105, 404] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__InJulia__1SHOT__20231227_201345__990.json | 25.0 | missing | missing | missing | |
| 8008 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny--optim | InJulia | 1SHOT | true | true | 5 | 20231215_195244__540 | 5 | 0.0 | 2.7387 | 4 | [105, 322] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__InJulia__1SHOT__20231215_195244__540.json | 100.0 | 0.9 | missing | 0.3 |
| 8009 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_202954__844 | 0 | 0.000113758 | 3.94925 | 0 | [146, 206] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_202954__844.json | 25.0 | missing | missing | missing | |
| 8010 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_193842__330 | 0 | 0.000122818 | 2.04242 | 0 | [146, 226] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_193842__330.json | 0.0 | missing | missing | missing | |
| 8011 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_193843__566 | 0 | 9.0202e-5 | 1.50077 | 0 | [146, 154] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_193843__566.json | 0.0 | missing | missing | missing | |
| 8012 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_201335__177 | 0 | 9.6997e-5 | 1.67662 | 0 | [146, 169] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_201335__177.json | 25.0 | missing | missing | missing | |
| 8013 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201337__897 | 3 | 0.000104698 | 1.80683 | 4 | [146, 186] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_201337__897.json | 90.0 | missing | missing | missing | |
| 8014 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_195241__547 | 3 | 0.0 | 1.64116 | 4 | [146, 184] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_195241__547.json | 90.0 | 0.9 | missing | 0.3 |
| 8015 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_202950__888 | 0 | 0.000235103 | 8.79819 | 4 | [220, 451] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_202950__888.json | 75.0 | missing | missing | missing | |
| 8016 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_193836__729 | 0 | 0.000100109 | 6.5357 | 0 | [220, 153] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193836__729.json | 0.0 | missing | missing | missing | |
| 8017 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_193839__870 | 5 | 0.000163076 | 3.08497 | 4 | [220, 292] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_193839__870.json | 100.0 | missing | missing | missing | |
| 8018 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_201329__358 | 0 | 0.00015764 | 8.65949 | 0 | [220, 280] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201329__358.json | 25.0 | missing | missing | missing | |
| 8019 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_201334__251 | 0 | 0.000272702 | 4.63688 | 0 | [220, 534] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201334__251.json | 0.0 | missing | missing | missing | |
| 8020 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_195239__871 | 5 | 0.0 | 4.65411 | 4 | [220, 298] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195239__871.json | 100.0 | 0.9 | missing | 0.3 |
| 8021 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_203019__486 | 5 | 0.000182255 | 4.71768 | 4 | [412, 275] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203019__486.json | 100.0 | missing | missing | missing | |
| 8022 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_193908__887 | 0 | 0.000240239 | 3.73771 | 0 | [412, 403] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193908__887.json | 0.0 | missing | missing | missing | |
| 8023 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_193912__428 | 0 | 0.000202187 | 3.54773 | 0 | [412, 319] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_193912__428.json | 0.0 | missing | missing | missing | |
| 8024 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_201357__465 | 5 | 0.000228914 | 3.36754 | 4 | [412, 378] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201357__465.json | 100.0 | missing | missing | missing | |
| 8025 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_201400__553 | 0 | 0.000203546 | 2.91696 | 0 | [412, 322] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201400__553.json | 25.0 | missing | missing | missing | |
| 8026 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_195254__508 | 0 | 0.0 | 3.39278 | 0 | [412, 398] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195254__508.json | 0.0 | 0.9 | missing | 0.3 |
| 8027 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_203014__188 | 0 | 0.000190129 | 7.85798 | 0 | [410, 293] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_203014__188.json | 0.0 | missing | missing | missing | |
| 8028 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_193900__503 | 0 | 0.000218668 | 3.19843 | 0 | [410, 356] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_193900__503.json | 0.0 | missing | missing | missing | |
| 8029 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_193905__493 | 0 | 0.000267139 | 4.71254 | 0 | [410, 463] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_193905__493.json | 0.0 | missing | missing | missing | |
| 8030 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_201349__402 | 0 | 0.000260344 | 3.99832 | 0 | [410, 448] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_201349__402.json | 25.0 | missing | missing | missing | |
| 8031 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_201354__944 | 0 | 0.000308362 | 4.89914 | 0 | [410, 554] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_201354__944.json | 25.0 | missing | missing | missing | |
| 8032 | Apple-MacBook-Pro-M1 | ispersonal | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | false | 5 | 20231215_195250__394 | 0 | 0.0 | 3.46642 | 0 | [410, 401] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_195250__394.json | 25.0 | 0.9 | missing | 0.3 |
| 8033 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_155755__723 | 0 | 0.0 | 8.97573 | 0 | [101, 218] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_155755__723.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 8034 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_155802__475 | 0 | 0.0 | 6.58196 | 0 | [101, 156] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_155802__475.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 8035 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_155741__939 | 0 | 0.0 | 5.10294 | 0 | [104, 117] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_155741__939.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8036 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_155746__950 | 0 | 0.0 | 5.67622 | 0 | [104, 132] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_155746__950.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8037 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_010952__724 | 0 | 0.0 | 5.04747 | 0 | [104, 115] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_010952__724.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8038 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_155729__364 | 0 | 0.0 | 8.61374 | 0 | [145, 204] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155729__364.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8039 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_155736__503 | 0 | 0.0 | 6.33145 | 0 | [145, 144] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155736__503.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8040 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_010947__663 | 0 | 0.0 | 6.10335 | 0 | [145, 138] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_010947__663.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8041 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_155716__324 | 0 | 0.0 | 24.252 | 0 | [219, 449] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155716__324.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8042 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_155721__340 | 0 | 0.0 | 4.00173 | 0 | [219, 74] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155721__340.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8043 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_010941__277 | 0 | 0.0 | 8.2067 | 0 | [219, 44] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_010941__277.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8044 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_155855__227 | 0 | 0.0 | 18.7205 | 0 | [412, 415] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155855__227.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 8045 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_155914__851 | 0 | 0.0 | 19.3838 | 0 | [412, 431] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155914__851.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8046 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_011027__258 | 0 | 0.0 | 11.2751 | 0 | [412, 230] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011027__258.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8047 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_155816__699 | 0 | 0.0 | 14.1113 | 0 | [410, 302] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155816__699.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 8048 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_155836__365 | 0 | 0.0 | 19.755 | 0 | [410, 441] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155836__365.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8049 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_011015__403 | 0 | 0.0 | 23.6343 | 0 | [410, 532] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_011015__403.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8050 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_235109__901 | 0 | 0.0 | 14.4449 | 0 | [103, 452] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235109__901.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 8051 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_235120__768 | 0 | 0.0 | 10.3994 | 0 | [103, 323] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235120__768.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8052 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_235130__616 | 0 | 0.0 | 10.3611 | 0 | [103, 322] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235130__616.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8053 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_235148__534 | 0 | 0.0 | 18.2084 | 0 | [103, 570] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235148__534.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8054 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_235158__120 | 0 | 0.0 | 9.51657 | 0 | [103, 294] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235158__120.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8055 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235028__155 | 0 | 0.0 | 6.22616 | 0 | [144, 182] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235028__155.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8056 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235034__732 | 0 | 0.0 | 5.30238 | 0 | [144, 151] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235034__732.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8057 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_235039__494 | 0 | 0.0 | 5.4409 | 0 | [144, 156] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235039__494.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8058 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_235046__692 | 0 | 0.0 | 6.78885 | 0 | [144, 200] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235046__692.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8059 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_235055__575 | 0 | 0.0 | 9.11697 | 0 | [144, 274] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235055__575.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8060 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_234947__504 | 0 | 0.0 | 10.092 | 0 | [218, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234947__504.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8061 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234952__395 | 5 | 0.0 | 5.03961 | 4 | [218, 132] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234952__395.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8062 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_235005__809 | 0 | 0.0 | 12.6276 | 0 | [218, 374] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235005__809.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8063 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_235012__727 | 0 | 0.0 | 7.20025 | 0 | [218, 202] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235012__727.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8064 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_235022__437 | 0 | 0.0 | 10.257 | 0 | [218, 300] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235022__437.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8065 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235332__293 | 0 | 0.0 | 16.1225 | 0 | [411, 445] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235332__293.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8066 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235343__570 | 0 | 0.0 | 10.8657 | 0 | [411, 283] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235343__570.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8067 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235358__234 | 0 | 0.0 | 14.9814 | 0 | [411, 410] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235358__234.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8068 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_235414__498 | 0 | 0.0 | 16.095 | 0 | [411, 444] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235414__498.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8069 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235429__777 | 0 | 0.0 | 14.4418 | 0 | [411, 393] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235429__777.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8070 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_235214__212 | 0 | 0.0 | 16.3224 | 0 | [409, 451] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235214__212.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8071 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235231__375 | 0 | 0.0 | 16.2442 | 0 | [409, 448] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235231__375.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8072 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235245__935 | 0 | 0.0 | 14.5914 | 0 | [409, 398] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235245__935.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8073 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235302__169 | 0 | 0.0 | 16.5361 | 0 | [409, 457] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235302__169.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8074 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_235316__425 | 0 | 0.0 | 14.2829 | 0 | [409, 389] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235316__425.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8075 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_235613__964 | 5 | 0.0 | 11.9654 | 4 | [103, 293] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235613__964.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8076 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_235626__659 | 0 | 0.0 | 12.8726 | 0 | [103, 316] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235626__659.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8077 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_235640__684 | 0 | 0.0 | 14.2075 | 0 | [103, 350] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235640__684.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8078 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_235650__359 | 0 | 0.0 | 10.1751 | 0 | [103, 247] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235650__359.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8079 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_235707__401 | 0 | 0.0 | 16.3844 | 0 | [103, 405] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235707__401.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8080 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_235530__772 | 0 | 0.0 | 10.7264 | 0 | [144, 256] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235530__772.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8081 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235536__700 | 0 | 0.0 | 5.95002 | 0 | [144, 133] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235536__700.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8082 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235543__476 | 0 | 0.0 | 7.17292 | 0 | [144, 165] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235543__476.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8083 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_235549__275 | 3 | 0.0 | 5.94463 | 4 | [144, 133] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235549__275.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8084 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_235601__277 | 0 | 0.0 | 11.2921 | 0 | [144, 270] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235601__277.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8085 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_235447__645 | 0 | 0.0 | 18.8293 | 0 | [218, 428] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235447__645.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8086 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_235456__364 | 0 | 0.0 | 8.82706 | 0 | [218, 197] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235456__364.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8087 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_235502__806 | 5 | 0.0 | 6.06019 | 4 | [218, 126] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235502__806.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8088 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_235510__850 | 0 | 0.0 | 7.60673 | 0 | [218, 166] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235510__850.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8089 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_235520__755 | 0 | 0.0 | 9.61687 | 0 | [218, 217] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235520__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8090 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235854__790 | 0 | 0.0 | 17.9169 | 0 | [411, 392] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235854__790.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8091 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_235918__949 | 0 | 0.0 | 23.8077 | 0 | [411, 534] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235918__949.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8092 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235932__134 | 0 | 0.0 | 14.0347 | 0 | [411, 297] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235932__134.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8093 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_235947__611 | 0 | 0.0 | 15.3499 | 0 | [411, 329] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235947__611.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8094 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231228_000005__779 | 0 | 0.0 | 18.1208 | 0 | [411, 397] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000005__779.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8095 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235723__548 | 0 | 0.0 | 16.2363 | 0 | [409, 351] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235723__548.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8096 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_235741__939 | 0 | 0.0 | 17.7143 | 0 | [409, 387] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235741__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8097 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_235800__614 | 0 | 0.0 | 18.9499 | 0 | [409, 417] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235800__614.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8098 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235819__627 | 0 | 0.0 | 19.0657 | 0 | [409, 420] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235819__627.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8099 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235836__439 | 0 | 0.0 | 17.3335 | 0 | [409, 378] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235836__439.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8100 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_122451__321 | 0 | 0.0 | 16.7182 | 0 | [100, 303] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_122451__321.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8101 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_122508__713 | 0 | 0.0 | 16.1053 | 0 | [100, 292] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_122508__713.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8102 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_122418__471 | 5 | 0.0 | 20.8899 | 4 | [103, 381] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122418__471.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8103 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231226_122435__267 | 0 | 0.0 | 16.2782 | 0 | [103, 295] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122435__267.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8104 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_011317__610 | 5 | 0.0 | 15.0399 | 4 | [103, 271] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_011317__610.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8105 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_122346__135 | 0 | 0.0 | 8.2443 | 0 | [144, 140] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122346__135.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8106 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_122357__638 | 0 | 0.0 | 11.1003 | 0 | [144, 194] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122357__638.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8107 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_011301__671 | 0 | 0.0 | 8.6966 | 0 | [144, 148] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_011301__671.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8108 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122327__244 | 0 | 0.0 | 8.59542 | 0 | [218, 139] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122327__244.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8109 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122338__315 | 3 | 0.0 | 10.4612 | 4 | [218, 174] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122338__315.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8110 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_011253__562 | 0 | 0.0 | 17.3439 | 0 | [218, 135] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011253__562.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8111 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_122624__395 | 0 | 0.0 | 20.5047 | 0 | [411, 336] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122624__395.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8112 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_122649__841 | 0 | 0.0 | 25.4783 | 0 | [411, 426] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122649__841.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8113 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_011418__726 | 0 | 0.0 | 33.0579 | 0 | [411, 560] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011418__726.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8114 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_122537__255 | 0 | 0.0 | 29.8301 | 0 | [409, 504] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122537__255.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8115 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_122603__755 | 0 | 0.0 | 25.631 | 0 | [409, 429] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122603__755.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8116 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_011345__648 | 0 | 0.0 | 28.0065 | 0 | [409, 470] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_011345__648.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8117 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_121400__767 | 0 | 0.0 | 40.7177 | 0 | [107, 219] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_121400__767.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8118 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_121437__422 | 5 | 0.0 | 36.5609 | 4 | [107, 200] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_121437__422.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8119 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_121531__832 | 5 | 0.0 | 53.7717 | 4 | [107, 277] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_121531__832.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8120 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_152649__866 | 0 | 0.0 | 36.7192 | 0 | [107, 207] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_152649__866.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8121 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_152725__524 | 5 | 0.0 | 36.2257 | 4 | [107, 204] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_152725__524.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8122 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_121202__129 | 0 | 0.0 | 30.845 | 0 | [146, 161] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_121202__129.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8123 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_121232__114 | 5 | 0.0 | 30.307 | 4 | [146, 163] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_121232__114.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8124 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_121319__893 | 5 | 0.0 | 47.1066 | 4 | [146, 264] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_121319__893.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8125 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_152537__167 | 5 | 0.0 | 27.7726 | 4 | [146, 147] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_152537__167.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8126 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_152612__960 | 0 | 0.0 | 35.1585 | 0 | [146, 192] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_152612__960.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8127 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_120950__267 | 5 | 0.0 | 65.5192 | 4 | [217, 314] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_120950__267.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8128 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_121042__378 | 5 | 0.0 | 51.8698 | 4 | [217, 267] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_121042__378.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8129 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_121131__236 | 0 | 0.0 | 48.7247 | 0 | [217, 255] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_121131__236.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8130 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_152423__792 | 0 | 0.0 | 58.1812 | 0 | [217, 319] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_152423__792.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8131 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_152509__265 | 5 | 0.0 | 45.7665 | 4 | [217, 245] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_152509__265.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8132 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_121844__791 | 5 | 0.0 | 69.4579 | 4 | [420, 342] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_121844__791.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8133 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_121915__409 | 0 | 0.0 | 30.2009 | 0 | [420, 109] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_121915__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8134 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_122057__558 | 0 | 0.0 | 102.366 | 0 | [420, 515] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_122057__558.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8135 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_153059__844 | 0 | 0.0 | 84.8917 | 0 | [420, 434] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_153059__844.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8136 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_153138__904 | 5 | 0.0 | 39.1712 | 4 | [420, 167] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_153138__904.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8137 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_121614__470 | 5 | 0.0 | 42.9976 | 4 | [418, 183] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_121614__470.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8138 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_121656__186 | 0 | 0.0 | 41.7596 | 0 | [418, 176] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_121656__186.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8139 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_121735__361 | 5 | 0.0 | 39.1205 | 4 | [418, 166] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_121735__361.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8140 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_152824__245 | 5 | 0.0 | 59.0104 | 4 | [418, 284] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_152824__245.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8141 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_152934__696 | 0 | 0.0 | 69.439 | 0 | [418, 345] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_152934__696.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8142 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_160035__174 | 0 | 0.0 | 16.9922 | 0 | [109, 423] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_160035__174.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8143 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_160046__524 | 0 | 0.0 | 10.8129 | 0 | [109, 265] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_160046__524.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8144 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_160008__585 | 5 | 0.0 | 12.002 | 4 | [112, 295] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_160008__585.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8145 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_160018__659 | 5 | 0.0 | 10.5756 | 4 | [112, 259] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_160018__659.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8146 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_011110__358 | 0 | 0.0 | 12.7941 | 0 | [112, 314] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_011110__358.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8147 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_155946__412 | 0 | 0.0 | 8.13661 | 0 | [153, 191] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155946__412.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8148 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_155956__722 | 0 | 0.0 | 9.1324 | 0 | [153, 216] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155956__722.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8149 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011057__352 | 0 | 0.0 | 9.34527 | 0 | [153, 221] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_011057__352.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8150 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_155928__480 | 5 | 0.0 | 14.0881 | 4 | [227, 166] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155928__480.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8151 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_155938__553 | 5 | 0.0 | 9.82496 | 4 | [227, 220] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155938__553.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8152 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_011048__547 | 0 | 0.0 | 20.7439 | 0 | [227, 339] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011048__547.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8153 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_160138__846 | 0 | 0.0 | 17.3865 | 0 | [420, 378] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160138__846.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8154 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_160202__206 | 0 | 0.0 | 23.7856 | 4 | [420, 533] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160202__206.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8155 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_011145__897 | 0 | 0.0 | 18.6938 | 0 | [420, 408] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011145__897.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8156 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_160104__569 | 0 | 0.0 | 17.539 | 0 | [418, 382] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_160104__569.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8157 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_160121__101 | 0 | 0.0 | 17.1188 | 0 | [418, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_160121__101.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8158 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_011126__399 | 5 | 0.0 | 15.8569 | 4 | [418, 339] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_011126__399.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8159 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231214_073540__142 | 0 | 0.0 | 14.8862 | 0 | [96, 434] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_073540__142.json | 0.0 | missing | missing | missing | |
| 8160 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_150903__327 | 0 | 0.0 | 8.60555 | 0 | [107, 271] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_150903__327.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8161 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_150916__619 | 0 | 0.0 | 13.0482 | 0 | [107, 417] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_150916__619.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8162 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231214_073525__823 | 0 | 0.0 | 23.5377 | 0 | [113, 664] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_073525__823.json | 0.0 | missing | missing | missing | |
| 8163 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231225_150846__685 | 0 | 0.0 | 7.52348 | 0 | [110, 235] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_150846__685.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8164 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231225_150855__348 | 0 | 0.0 | 8.86333 | 0 | [110, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_150855__348.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8165 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231227_004535__583 | 0 | 0.0 | 10.678 | 0 | [110, 337] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_004535__583.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8166 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_073502__999 | 0 | 0.0 | 12.6661 | 0 | [142, 353] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_073502__999.json | 0.0 | missing | missing | missing | |
| 8167 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_150831__637 | 0 | 0.0 | 6.08251 | 0 | [151, 180] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_150831__637.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8168 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_150838__803 | 0 | 0.0 | 7.06358 | 0 | [151, 213] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_150838__803.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8169 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_004524__419 | 0 | 0.0 | 7.57785 | 0 | [151, 230] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_004524__419.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8170 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_073449__698 | 0 | 0.0 | 17.5631 | 0 | [217, 462] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073449__698.json | 25.0 | missing | missing | missing | |
| 8171 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_150813__251 | 0 | 0.0 | 10.5998 | 0 | [225, 143] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150813__251.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8172 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_150825__373 | 0 | 0.0 | 11.8628 | 0 | [225, 354] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150825__373.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8173 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_004517__953 | 0 | 0.0 | 17.157 | 0 | [225, 363] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004517__953.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8174 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_073626__925 | 0 | 0.0 | 25.0838 | 0 | [11, 659] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073626__925.json | 25.0 | missing | missing | missing | |
| 8175 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_150950__754 | 0 | 0.0 | 10.2712 | 0 | [418, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150950__754.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8176 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_151006__312 | 0 | 0.0 | 16.2016 | 0 | [418, 452] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_151006__312.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8177 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_004602__946 | 5 | 0.0 | 10.9632 | 4 | [418, 288] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004602__946.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8178 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_073600__896 | 0 | 0.0 | 20.274 | 0 | [413, 449] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_073600__896.json | 25.0 | missing | missing | missing | |
| 8179 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_150925__459 | 0 | 0.0 | 8.83616 | 0 | [416, 227] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_150925__459.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8180 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_150939__802 | 0 | 0.0 | 14.1221 | 0 | [416, 393] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_150939__802.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8181 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_004551__218 | 0 | 0.0 | 16.0876 | 0 | [416, 451] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_004551__218.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8182 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231214_074815__395 | 0 | 0.0 | 12.1722 | 0 | [96, 356] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__AsIs__1SHOT__20231214_074815__395.json | 0.0 | missing | missing | missing | |
| 8183 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_153040__565 | 0 | 0.0 | 2.18934 | 0 | [113, 22] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__AsIs__1SHOT__20231225_153040__565.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8184 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_153056__478 | 0 | 0.0 | 15.8903 | 0 | [113, 283] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__AsIs__1SHOT__20231225_153056__478.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8185 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231214_074803__733 | 0 | 0.0 | 12.8417 | 0 | [113, 370] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__InJulia__1SHOT__20231214_074803__733.json | 25.0 | missing | missing | missing | |
| 8186 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_153029__310 | 0 | 0.0 | 11.9424 | 0 | [116, 209] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__InJulia__1SHOT__20231225_153029__310.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8187 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_153038__804 | 0 | 0.0 | 9.0308 | 0 | [116, 153] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__InJulia__1SHOT__20231225_153038__804.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8188 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231227_005603__263 | 0 | 0.0 | 13.8233 | 0 | [116, 243] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__InJulia__1SHOT__20231227_005603__263.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8189 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_074750__215 | 0 | 0.0 | 9.32874 | 0 | [142, 255] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_074750__215.json | 0.0 | missing | missing | missing | |
| 8190 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_153005__150 | 0 | 0.0 | 4.16475 | 0 | [155, 55] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_153005__150.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8191 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_153017__625 | 0 | 0.0 | 11.6633 | 0 | [155, 197] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_153017__625.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8192 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_005550__196 | 0 | 0.0 | 6.20877 | 0 | [155, 94] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_005550__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8193 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_074740__886 | 0 | 0.0 | 16.2354 | 0 | [217, 426] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074740__886.json | 25.0 | missing | missing | missing | |
| 8194 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152942__532 | 0 | 0.0 | 27.9905 | 0 | [230, 310] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152942__532.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8195 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_153001__212 | 0 | 0.0 | 19.601 | 0 | [230, 328] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_153001__212.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8196 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_005543__639 | 0 | 0.0 | 37.8965 | 0 | [230, 493] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005543__639.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8197 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_074913__516 | 0 | 0.0 | 24.3648 | 0 | [11, 643] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074913__516.json | 0.0 | missing | missing | missing | |
| 8198 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_153221__270 | 0 | 0.0 | 26.3958 | 0 | [419, 410] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_153221__270.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8199 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_153247__214 | 0 | 0.0 | 26.239 | 0 | [419, 407] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_153247__214.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8200 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_005650__384 | 0 | 0.0 | 20.4733 | 0 | [419, 305] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_005650__384.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8201 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_074849__567 | 0 | 0.0 | 34.082 | 0 | [413, 787] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_074849__567.json | 0.0 | missing | missing | missing | |
| 8202 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_153120__953 | 0 | 0.0 | 24.0239 | 0 | [416, 373] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_153120__953.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8203 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_153154__183 | 0 | 0.0 | 33.9153 | 0 | [416, 543] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_153154__183.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8204 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_005629__785 | 0 | 0.0 | 25.8194 | 0 | [416, 402] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_005629__785.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8205 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_160317__875 | 0 | 0.0 | 7.75287 | 0 | [97, 296] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_160317__875.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8206 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_160338__391 | 0 | 0.0 | 21.1074 | 0 | [97, 787] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_160338__391.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8207 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_160249__543 | 0 | 0.0 | 22.505 | 0 | [100, 835] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_160249__543.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8208 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_160310__234 | 0 | 0.0 | 20.8945 | 0 | [100, 779] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_160310__234.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8209 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_011218__740 | 0 | 0.0 | 14.8652 | 0 | [100, 560] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_011218__740.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8210 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_160214__205 | 0 | 0.0 | 6.42365 | 0 | [137, 239] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_160214__205.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8211 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_160226__694 | 0 | 0.0 | 11.8793 | 0 | [137, 446] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_160226__694.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8212 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_011203__560 | 0 | 0.0 | 14.3323 | 0 | [137, 533] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_011203__560.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8213 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_160207__583 | 0 | 0.0 | 4.3899 | 0 | [209, 2] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160207__583.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8214 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_160208__442 | 0 | 0.0 | 0.915709 | 0 | [209, 11] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160208__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8215 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_011149__288 | 0 | 0.0 | 4.34352 | 0 | [209, 10] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011149__288.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8216 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_160413__229 | 0 | 0.0 | 8.93036 | 0 | [389, 286] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160413__229.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8217 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_160424__830 | 0 | 0.0 | 10.8322 | 0 | [389, 354] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160424__830.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8218 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_011235__982 | 0 | 0.0 | 8.25612 | 0 | [389, 259] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011235__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8219 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_160351__453 | 0 | 0.0 | 12.4 | 0 | [386, 410] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_160351__453.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8220 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_160404__618 | 0 | 0.0 | 12.8588 | 0 | [386, 427] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_160404__618.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8221 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_011227__521 | 0 | 0.0 | 9.0866 | 0 | [386, 290] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_011227__521.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8222 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231214_075021__630 | 0 | 0.0 | 14.9628 | 0 | [96, 436] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_075021__630.json | 0.0 | missing | missing | missing | |
| 8223 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_153709__484 | 0 | 0.0 | 45.5414 | 0 | [121, 349] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_153709__484.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8224 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_153746__622 | 0 | 0.0 | 36.6344 | 0 | [121, 278] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_153746__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8225 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231214_075006__883 | 0 | 0.0 | 15.5307 | 0 | [113, 446] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_075006__883.json | 0.0 | missing | missing | missing | |
| 8226 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231225_153545__855 | 0 | 0.0 | 34.2947 | 0 | [124, 259] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_153545__855.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8227 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_153623__841 | 0 | 0.0 | 38.3017 | 0 | [124, 291] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_153623__841.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8228 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231227_005918__119 | 0 | 0.0 | 56.7 | 0 | [124, 432] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_005918__119.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8229 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_074950__386 | 0 | 0.0 | 17.9148 | 0 | [142, 500] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_074950__386.json | 0.0 | missing | missing | missing | |
| 8230 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_153432__726 | 0 | 0.0 | 28.2964 | 0 | [163, 200] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_153432__726.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8231 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_153511__760 | 0 | 0.0 | 38.1692 | 0 | [163, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_153511__760.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8232 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_005821__527 | 0 | 0.0 | 27.4833 | 0 | [163, 193] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_005821__527.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8233 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_074932__833 | 0 | 0.0 | 18.7133 | 0 | [217, 494] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074932__833.json | 0.0 | missing | missing | missing | |
| 8234 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_153334__129 | 5 | 0.0 | 46.8463 | 4 | [238, 160] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_153334__129.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8235 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_153404__372 | 5 | 0.0 | 30.0718 | 4 | [238, 202] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_153404__372.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8236 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_005754__980 | 0 | 0.0 | 64.1516 | 0 | [238, 307] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005754__980.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8237 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_075120__174 | 0 | 0.0 | 25.8988 | 0 | [11, 679] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075120__174.json | 50.0 | missing | missing | missing | |
| 8238 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_153947__108 | 0 | 0.0 | 49.5844 | 0 | [427, 318] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_153947__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8239 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_154029__485 | 0 | 0.0 | 41.3716 | 0 | [427, 254] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_154029__485.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8240 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_010125__990 | 0 | 0.0 | 62.4156 | 0 | [427, 412] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_010125__990.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8241 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_075054__801 | 0 | 0.0 | 33.3806 | 0 | [413, 771] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_075054__801.json | 25.0 | missing | missing | missing | |
| 8242 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_153816__699 | 0 | 0.0 | 30.6252 | 0 | [424, 172] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_153816__699.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8243 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_153857__759 | 0 | 0.0 | 40.6102 | 0 | [424, 250] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_153857__759.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8244 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_010022__451 | 0 | 0.0 | 63.9743 | 0 | [424, 427] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_010022__451.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8245 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_155504__289 | 0 | 0.0 | 25.8939 | 0 | [109, 435] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_155504__289.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8246 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_155517__167 | 0 | 0.0 | 12.9666 | 0 | [109, 212] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_155517__167.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8247 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_155417__822 | 0 | 0.0 | 24.834 | 0 | [112, 417] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_155417__822.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8248 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_155438__612 | 0 | 0.0 | 20.7802 | 0 | [112, 347] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_155438__612.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8249 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_010850__768 | 0 | 0.0 | 21.8488 | 0 | [112, 364] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_010850__768.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8250 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_155341__560 | 0 | 0.0 | 14.8303 | 0 | [153, 239] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155341__560.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8251 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_155353__320 | 0 | 0.0 | 11.3318 | 0 | [153, 178] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155353__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8252 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_010829__777 | 0 | 0.0 | 12.5769 | 0 | [153, 199] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_010829__777.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8253 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_155315__248 | 0 | 0.0 | 20.0453 | 0 | [227, 156] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155315__248.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8254 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_155326__871 | 0 | 0.0 | 11.4974 | 0 | [227, 167] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155326__871.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8255 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_010816__296 | 0 | 0.0 | 26.6556 | 0 | [227, 278] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_010816__296.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8256 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_155621__437 | 0 | 0.0 | 20.0642 | 0 | [420, 283] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155621__437.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8257 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_155652__244 | 0 | 0.0 | 30.7366 | 0 | [420, 459] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155652__244.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8258 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_010932__808 | 0 | 0.0 | 22.918 | 0 | [420, 329] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_010932__808.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8259 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_155536__412 | 0 | 0.0 | 18.7533 | 0 | [418, 261] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155536__412.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8260 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_155601__641 | 0 | 0.0 | 25.5785 | 0 | [418, 375] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155601__641.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8261 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_010909__172 | 0 | 0.0 | 18.9263 | 0 | [418, 263] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_010909__172.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8262 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231214_074634__315 | 0 | 0.0 | 14.0595 | 0 | [96, 410] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__AsIs__1SHOT__20231214_074634__315.json | 0.0 | missing | missing | missing | |
| 8263 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_152843__642 | 0 | 0.0 | 4.03141 | 0 | [110, 224] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_152843__642.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8264 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_152847__240 | 0 | 0.0 | 4.07055 | 0 | [110, 226] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_152847__240.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8265 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231214_074620__866 | 0 | 0.0 | 14.8885 | 0 | [113, 428] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_074620__866.json | 0.0 | missing | missing | missing | |
| 8266 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_152831__179 | 0 | 0.0 | 5.60649 | 0 | [113, 314] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_152831__179.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8267 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_152838__318 | 0 | 0.0 | 6.95534 | 0 | [113, 388] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_152838__318.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8268 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231227_005447__944 | 0 | 0.0 | 8.17257 | 0 | [113, 450] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_005447__944.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8269 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_074605__350 | 0 | 0.0 | 9.50168 | 0 | [142, 261] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_074605__350.json | 0.0 | missing | missing | missing | |
| 8270 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_152818__378 | 0 | 0.0 | 5.96426 | 0 | [150, 325] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_152818__378.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8271 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_152826__428 | 0 | 0.0 | 8.29983 | 0 | [150, 452] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_152826__428.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8272 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_005439__117 | 0 | 0.0 | 9.60422 | 0 | [150, 515] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_005439__117.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8273 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_074555__778 | 0 | 0.0 | 15.3458 | 0 | [217, 402] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074555__778.json | 25.0 | missing | missing | missing | |
| 8274 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_152805__340 | 0 | 0.0 | 10.3507 | 0 | [220, 391] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152805__340.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8275 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_152812__997 | 0 | 0.0 | 6.21582 | 0 | [220, 322] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152812__997.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8276 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_005429__995 | 0 | 0.0 | 9.05795 | 0 | [220, 331] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005429__995.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8277 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_074724__911 | 0 | 0.0 | 27.6163 | 0 | [11, 721] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074724__911.json | 0.0 | missing | missing | missing | |
| 8278 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_152908__981 | 0 | 0.0 | 6.26613 | 0 | [400, 275] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152908__981.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8279 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_152914__724 | 0 | 0.0 | 5.59033 | 0 | [400, 240] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152914__724.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8280 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_005505__132 | 0 | 0.0 | 7.62989 | 0 | [400, 342] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_005505__132.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8281 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_074656__161 | 0 | 0.0 | 22.6454 | 0 | [413, 510] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_074656__161.json | 25.0 | missing | missing | missing | |
| 8282 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_152855__499 | 0 | 0.0 | 8.69609 | 0 | [398, 399] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_152855__499.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8283 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_152902__635 | 0 | 0.0 | 6.42482 | 0 | [398, 284] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_152902__635.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8284 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_005458__737 | 0 | 0.0 | 10.7113 | 0 | [398, 494] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_005458__737.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8285 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231214_073710__389 | 0 | 0.0 | 16.8684 | 0 | [96, 489] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__AsIs__1SHOT__20231214_073710__389.json | 0.0 | missing | missing | missing | |
| 8286 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_151122__794 | 0 | 0.0 | 11.0257 | 0 | [109, 349] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_151122__794.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8287 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_151131__492 | 0 | 0.0 | 8.92318 | 0 | [109, 281] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_151131__492.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8288 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231214_073653__836 | 0 | 0.0 | 12.5901 | 0 | [113, 363] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_073653__836.json | 25.0 | missing | missing | missing | |
| 8289 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | InJulia | 1SHOT | false | false | 5 | 20231225_151057__332 | 0 | 0.0 | 8.89659 | 0 | [112, 280] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_151057__332.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8290 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_151110__308 | 0 | 0.0 | 13.4202 | 0 | [112, 427] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_151110__308.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8291 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | InJulia | 1SHOT | false | false | 5 | 20231227_004632__406 | 0 | 0.0 | 7.8559 | 0 | [112, 244] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_004632__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8292 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_073640__690 | 0 | 0.0 | 6.9744 | 0 | [142, 185] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_073640__690.json | 25.0 | missing | missing | missing | |
| 8293 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_151037__824 | 0 | 0.0 | 9.7004 | 0 | [153, 300] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_151037__824.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8294 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_151048__166 | 0 | 0.0 | 11.3251 | 0 | [153, 353] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_151048__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |
| 8367 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_021904__441 | 0 | 0.0 | 0.417996 | 0 | [0, 31] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_021904__441.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8368 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_021904__697 | 0 | 0.0 | 0.431488 | 0 | [0, 31] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_021904__697.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8369 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_022021__789 | 2 | 0.0 | 2.24818 | 4 | [0, 163] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022021__789.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8370 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_022022__306 | 0 | 0.0 | 1.49772 | 0 | [0, 109] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022022__306.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8371 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_022024__868 | 0 | 0.0 | 1.37711 | 0 | [0, 100] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022024__868.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8372 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_022026__550 | 2 | 0.0 | 2.26037 | 4 | [0, 164] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022026__550.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8373 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_022029__456 | 5 | 0.0 | 3.04334 | 4 | [0, 220] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022029__456.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8374 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_021958__482 | 1 | 0.0 | 2.93178 | 0 | [0, 217] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_021958__482.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8375 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_022000__668 | 0 | 0.0 | 1.67947 | 0 | [0, 125] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_022000__668.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8376 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_022003__126 | 3 | 0.0 | 3.94364 | 4 | [0, 291] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_022003__126.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8377 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_022004__231 | 0 | 0.0 | 0.584064 | 4 | [0, 43] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_022004__231.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8378 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_022005__905 | 0 | 0.0 | 0.5819 | 4 | [0, 43] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_022005__905.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8379 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231214_075815__292 | 0 | 0.0 | 9.10482 | 0 | [52, 277] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__AsIs__1SHOT__20231214_075815__292.json | 0.0 | missing | missing | missing | |
| 8380 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_115508__435 | 0 | 0.0 | 8.21952 | 0 | [48, 149] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_115508__435.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8381 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_115512__921 | 0 | 0.0 | 3.28933 | 0 | [48, 54] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_115512__921.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8382 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | InJulia | 1SHOT | true | true | 5 | 20231214_075806__360 | 1 | 0.0 | 9.69465 | 0 | [69, 289] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_075806__360.json | 55.0 | missing | missing | missing | |
| 8383 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_115456__608 | 0 | 0.0 | 4.01778 | 0 | [51, 68] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_115456__608.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8384 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_115500__436 | 0 | 0.0 | 4.48762 | 0 | [51, 77] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_115500__436.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8385 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075756__804 | 1 | 0.0 | 6.53698 | 0 | [98, 184] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_075756__804.json | 55.0 | missing | missing | missing | |
| 8386 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115445__558 | 0 | 0.0 | 2.38713 | 0 | [52, 36] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_115445__558.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8387 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115452__743 | 0 | 0.0 | 6.92008 | 0 | [52, 124] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_115452__743.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8388 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_075750__401 | 0 | 0.0 | 19.5067 | 0 | [188, 527] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075750__401.json | 0.0 | missing | missing | missing | |
| 8389 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_115439__149 | 0 | 0.0 | 12.7531 | 0 | [81, 37] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115439__149.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8390 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_115442__757 | 0 | 0.0 | 3.12204 | 0 | [81, 45] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115442__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8391 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_075851__102 | 0 | 0.0 | 19.3053 | 0 | [11, 525] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075851__102.json | 50.0 | missing | missing | missing | |
| 8392 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_115526__376 | 0 | 0.0 | 1.36246 | 0 | [69, 11] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115526__376.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8393 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_115532__926 | 0 | 0.0 | 6.56186 | 0 | [69, 112] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115532__926.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8394 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_075832__708 | 0 | 0.0 | 16.6763 | 0 | [369, 375] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_075832__708.json | 0.0 | missing | missing | missing | |
| 8395 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_115517__968 | 0 | 0.0 | 5.3688 | 0 | [66, 89] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_115517__968.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8396 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_115525__395 | 0 | 0.0 | 7.39124 | 0 | [66, 128] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_115525__395.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8397 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_022319__618 | 0 | 0.0 | 6.96654 | 0 | [0, 250] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022319__618.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8398 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_022325__382 | 0 | 0.0 | 6.06739 | 0 | [0, 216] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022325__382.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8399 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_022331__984 | 3 | 0.0 | 5.29758 | 4 | [0, 192] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022331__984.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8400 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_022339__157 | 5 | 0.0 | 8.09213 | 4 | [0, 289] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022339__157.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8401 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_022349__714 | 3 | 0.0 | 10.5943 | 4 | [0, 382] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022349__714.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8402 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_022236__731 | 5 | 0.0 | 5.68785 | 4 | [0, 201] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_022236__731.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8403 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_022240__293 | 3 | 0.0 | 3.73314 | 4 | [0, 134] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_022240__293.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8404 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_022240__745 | 0 | 0.0 | 0.754837 | 0 | [0, 27] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_022240__745.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8405 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_022241__455 | 0 | 0.0 | 0.759022 | 0 | [0, 27] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_022241__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8406 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_022242__157 | 0 | 0.0 | 1.07858 | 0 | [0, 39] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_022242__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8407 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_022202__794 | 0 | 0.0 | 1.21969 | 0 | [0, 44] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022202__794.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8408 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_022209__439 | 5 | 0.0 | 7.48558 | 4 | [0, 269] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022209__439.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8409 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_022212__442 | 0 | 0.0 | 2.10883 | 0 | [0, 76] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022212__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8410 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_022212__878 | 0 | 0.0 | 0.748979 | 0 | [0, 27] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022212__878.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8411 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_022213__863 | 0 | 0.0 | 0.749559 | 0 | [0, 27] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022213__863.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8412 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_022409__930 | 0 | 0.0 | 0.114433 | 0 | [0, 4] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022409__930.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8413 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_022410__207 | 0 | 0.0 | 0.114441 | 0 | [0, 4] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022410__207.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8414 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_022414__458 | 1 | 0.0 | 4.69702 | 0 | [0, 168] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022414__458.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8415 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_022415__693 | 0 | 0.0 | 0.982301 | 0 | [0, 35] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022415__693.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8416 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_022417__264 | 0 | 0.0 | 1.59644 | 0 | [0, 57] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_022417__264.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8417 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_022403__989 | 0 | 0.0 | 2.97857 | 0 | [0, 105] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_022403__989.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8418 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_022404__589 | 0 | 0.0 | 0.797674 | 0 | [0, 28] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_022404__589.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8419 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_022405__663 | 0 | 0.0 | 0.786475 | 0 | [0, 28] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_022405__663.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8420 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_022407__757 | 4 | 0.0 | 1.8398 | 4 | [0, 66] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_022407__757.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8421 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_022409__362 | 0 | 0.0 | 1.87569 | 0 | [0, 67] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_022409__362.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8422 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_021343__287 | 5 | 0.0 | 18.1153 | 4 | [0, 445] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_021343__287.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8423 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | false | 5 | 20240201_021400__960 | 0 | 0.0 | 16.5928 | 0 | [0, 408] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_021400__960.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8424 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_021414__598 | 5 | 0.0 | 13.8922 | 4 | [0, 342] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_021414__598.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8425 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_021434__428 | 0 | 0.0 | 20.0398 | 0 | [0, 490] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_021434__428.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8426 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | false | 5 | 20240201_021445__641 | 0 | 0.0 | 11.5624 | 0 | [0, 282] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_021445__641.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8427 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_021220__629 | 0 | 0.0 | 10.9163 | 0 | [0, 265] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_021220__629.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8428 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_021224__136 | 0 | 0.0 | 3.94154 | 0 | [0, 96] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_021224__136.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8429 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_021231__196 | 0 | 0.0 | 6.40434 | 0 | [0, 156] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_021231__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8430 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_021236__763 | 0 | 0.0 | 5.90347 | 0 | [0, 145] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_021236__763.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8431 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_021246__629 | 0 | 0.0 | 9.06011 | 0 | [0, 223] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_021246__629.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8432 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_021100__648 | 0 | 0.0 | 10.3685 | 0 | [0, 254] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_021100__648.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8433 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_021107__285 | 0 | 0.0 | 6.70808 | 0 | [0, 161] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_021107__285.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8434 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_021125__120 | 5 | 0.0 | 18.4049 | 4 | [0, 438] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_021125__120.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8435 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_021129__687 | 0 | 0.0 | 3.65312 | 0 | [0, 88] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_021129__687.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8436 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_021134__985 | 0 | 0.0 | 5.02935 | 0 | [0, 121] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_021134__985.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8437 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_021734__409 | 0 | 0.0 | 3.69364 | 0 | [0, 90] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_021734__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8438 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_021746__734 | 0 | 0.0 | 11.3709 | 0 | [0, 276] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_021746__734.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8439 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_021756__296 | 0 | 0.0 | 10.4692 | 0 | [0, 253] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_021756__296.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8440 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_021829__684 | 0 | 0.0 | 32.9134 | 4 | [0, 790] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_021829__684.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8441 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_021846__637 | 0 | 0.0 | 16.8824 | 0 | [0, 406] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_021846__637.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8442 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_021606__389 | 0 | 0.0 | 36.8497 | 0 | [0, 888] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_021606__389.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8443 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_021621__443 | 5 | 0.0 | 15.3914 | 4 | [0, 373] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_021621__443.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8444 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_021630__257 | 5 | 0.0 | 8.17534 | 4 | [0, 199] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_021630__257.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8445 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_021639__166 | 3 | 0.0 | 9.50498 | 4 | [0, 231] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_021639__166.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8446 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_021645__870 | 0 | 0.0 | 5.5245 | 0 | [0, 135] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_021645__870.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8447 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_020507__587 | 0 | 0.0 | 14.9 | 0 | [0, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_020507__587.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8448 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_020521__445 | 5 | 0.0 | 14.6818 | 4 | [0, 275] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_020521__445.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8449 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_020535__512 | 5 | 0.0 | 13.475 | 4 | [0, 252] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_020535__512.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8450 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_020547__644 | 5 | 0.0 | 11.6901 | 4 | [0, 219] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_020547__644.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8451 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_020557__734 | 5 | 0.0 | 10.4825 | 4 | [0, 196] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_020557__734.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8452 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_020325__603 | 0 | 0.0 | 19.6012 | 0 | [0, 366] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_020325__603.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8453 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_020333__133 | 3 | 0.0 | 7.84461 | 4 | [0, 147] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_020333__133.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8454 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_020342__200 | 5 | 0.0 | 9.1321 | 4 | [0, 171] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_020342__200.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8455 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_020353__322 | 0 | 0.0 | 11.5886 | 0 | [0, 217] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_020353__322.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8456 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_020400__534 | 0 | 0.0 | 6.72043 | 0 | [0, 126] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_020400__534.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8457 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_020133__683 | 5 | 0.0 | 12.2145 | 4 | [0, 228] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_020133__683.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8458 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_020141__311 | 5 | 0.0 | 8.03119 | 4 | [0, 150] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_020141__311.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8459 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_020150__292 | 1 | 0.0 | 9.052 | 4 | [0, 169] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_020150__292.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8460 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_020202__548 | 0 | 0.0 | 11.8454 | 0 | [0, 221] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_020202__548.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8461 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_020208__134 | 5 | 0.0 | 6.63634 | 4 | [0, 124] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_020208__134.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8462 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_020853__635 | 5 | 0.0 | 9.12789 | 4 | [0, 168] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_020853__635.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8463 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_020902__874 | 3 | 0.0 | 9.42524 | 4 | [0, 173] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_020902__874.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8464 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_020911__555 | 3 | 0.0 | 8.65819 | 4 | [0, 160] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_020911__555.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8465 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_020923__945 | 0 | 0.0 | 12.1518 | 0 | [0, 222] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_020923__945.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8466 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_020935__166 | 0 | 0.0 | 11.9779 | 0 | [0, 221] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_020935__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8467 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_020710__401 | 5 | 0.0 | 11.0213 | 4 | [0, 204] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_020710__401.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8468 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_020719__559 | 0 | 0.0 | 8.40114 | 0 | [0, 156] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_020719__559.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8469 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_020727__350 | 0 | 0.0 | 7.85367 | 0 | [0, 146] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_020727__350.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8470 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_020740__646 | 5 | 0.0 | 13.4999 | 4 | [0, 250] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_020740__646.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8471 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_020748__273 | 4 | 0.0 | 7.31466 | 4 | [0, 136] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_020748__273.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8472 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_022046__594 | 0 | 0.0 | 0.25406 | 0 | [0, 31] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022046__594.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8473 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_022046__912 | 0 | 0.0 | 0.254596 | 0 | [0, 31] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022046__912.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8474 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_022047__313 | 0 | 0.0 | 0.256888 | 0 | [0, 31] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022047__313.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8475 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_022047__858 | 0 | 0.0 | 0.629397 | 0 | [0, 77] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022047__858.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8476 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_022048__157 | 0 | 0.0 | 0.63078 | 0 | [0, 77] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_022048__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8477 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_022041__254 | 0 | 0.0 | 0.240235 | 0 | [0, 29] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_022041__254.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8478 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_022042__390 | 0 | 0.0 | 0.224812 | 0 | [0, 27] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_022042__390.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8479 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_022042__568 | 1 | 0.0 | 1.44607 | 0 | [0, 175] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_022042__568.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8480 | NVIDIA-RTX-4090-4x | keep_only_names | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_022043__128 | 0 | 0.0 | 0.240372 | 0 | [0, 29] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_022043__128.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |

*(Rows 8481–8550 omitted for brevity: further per-run evaluations of the `keep_only_names` utility-function task for `codellama:7b-instruct-q4_K_M`, `deepseek-coder:33b-instruct-q4_K_M`, `dolphin-phi:2.7b-v2.6-q6_K`, and `gemini-1.0-pro-latest` across the `JuliaExpertAsk`, `JuliaExpertCoTTask`, `JuliaRecapTask`, `JuliaRecapCoTTask`, `InJulia`, and `AsIs` prompt templates.)*
| 8551 | Apple-MacBook-Pro-M1 | keep_only_names | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_111321__132 | 0 | 0.0 | 2.08655 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111321__132.json | 25.0 | missing | missing | missing | |
| 8552 | Apple-MacBook-Pro-M1 | keep_only_names | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_111324__621 | 0 | 0.0 | 3.21855 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111324__621.json | 25.0 | missing | missing | missing | |
| 8553 | Apple-MacBook-Pro-M1 | keep_only_names | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_111326__942 | 0 | 0.0 | 1.89873 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111326__942.json | 25.0 | missing | missing | missing | |
| 8554 | Apple-MacBook-Pro-M1 | keep_only_names | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_111243__104 | 0 | 0.0 | 4.05335 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111243__104.json | 25.0 | missing | missing | missing | |
| 8555 | Apple-MacBook-Pro-M1 | keep_only_names | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_111245__685 | 0 | 0.0 | 2.0692 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111245__685.json | 50.0 | missing | missing | missing | |
| 8556 | Apple-MacBook-Pro-M1 | keep_only_names | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_111252__322 | 0 | 0.0 | 6.85583 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111252__322.json | 25.0 | missing | missing | missing | |
| 8557 | Apple-MacBook-Pro-M1 | keep_only_names | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_111256__382 | 0 | 0.0 | 4.04 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111256__382.json | 0.0 | missing | missing | missing | |
| 8558 | Apple-MacBook-Pro-M1 | keep_only_names | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_111258__668 | 3 | 0.0 | 1.8739 | 4 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111258__668.json | 90.0 | missing | missing | missing | |
| 8559 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_234044__476 | 0 | 0.0 | 11.8326 | 0 | [0, 180] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_234044__476.json | 25.0 | missing | missing | missing | |
| 8560 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_234107__742 | 0 | 0.0 | 23.5735 | 0 | [0, 364] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_234107__742.json | 25.0 | missing | missing | missing | |
| 8561 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_234119__914 | 0 | 0.0 | 11.6862 | 0 | [0, 177] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_234119__914.json | 25.0 | missing | missing | missing | |
| 8562 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_234134__247 | 0 | 0.0 | 14.9547 | 0 | [0, 228] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_234134__247.json | 0.0 | missing | missing | missing | |
| 8563 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | false | 5 | 20240223_234149__830 | 0 | 0.0 | 14.9459 | 0 | [0, 227] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_234149__830.json | 25.0 | missing | missing | missing | |
| 8564 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_233847__838 | 1 | 0.0 | 10.5873 | 0 | [0, 162] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_233847__838.json | 55.0 | missing | missing | missing | |
| 8565 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_233902__443 | 1 | 0.0 | 14.7009 | 0 | [0, 225] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_233902__443.json | 55.0 | missing | missing | missing | |
| 8566 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_233904__772 | 1 | 0.0 | 2.03017 | 0 | [0, 32] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_233904__772.json | 55.0 | missing | missing | missing | |
| 8567 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_233912__129 | 1 | 0.0 | 8.24882 | 0 | [0, 125] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_233912__129.json | 55.0 | missing | missing | missing | |
| 8568 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_233915__957 | 0 | 0.0 | 3.04416 | 0 | [0, 46] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_233915__957.json | 50.0 | missing | missing | missing | |
| 8569 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_233613__599 | 0 | 0.0 | 22.7177 | 0 | [0, 347] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_233613__599.json | 25.0 | missing | missing | missing | |
| 8570 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_233634__462 | 0 | 0.0 | 20.6621 | 0 | [0, 317] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_233634__462.json | 25.0 | missing | missing | missing | |
| 8571 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_233658__322 | 0 | 0.0 | 23.7693 | 0 | [0, 360] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_233658__322.json | 25.0 | missing | missing | missing | |
| 8572 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_233717__803 | 0 | 0.0 | 19.0507 | 0 | [0, 290] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_233717__803.json | 25.0 | missing | missing | missing | |
| 8573 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_233742__982 | 0 | 0.0 | 24.7291 | 0 | [0, 380] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_233742__982.json | 25.0 | missing | missing | missing | |
| 8574 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_234746__827 | 5 | 0.0 | 20.6616 | 4 | [0, 310] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_234746__827.json | 100.0 | missing | missing | missing | |
| 8575 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_234803__660 | 1 | 0.0 | 17.0293 | 0 | [0, 258] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_234803__660.json | 55.0 | missing | missing | missing | |
| 8576 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_234821__233 | 0 | 0.0 | 18.321 | 0 | [0, 279] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_234821__233.json | 50.0 | missing | missing | missing | |
| 8577 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_234843__817 | 1 | 0.0 | 21.1654 | 0 | [0, 322] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_234843__817.json | 55.0 | missing | missing | missing | |
| 8578 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240223_234901__143 | 1 | 0.0 | 18.032 | 4 | [0, 277] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240223_234901__143.json | 80.0 | missing | missing | missing | |
| 8579 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_234408__874 | 0 | 0.0 | 27.3338 | 0 | [0, 412] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_234408__874.json | 50.0 | missing | missing | missing | |
| 8580 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_234429__174 | 1 | 0.0 | 20.8236 | 0 | [0, 318] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_234429__174.json | 55.0 | missing | missing | missing | |
| 8581 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_234452__282 | 1 | 0.0 | 23.1053 | 0 | [0, 347] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_234452__282.json | 55.0 | missing | missing | missing | |
| 8582 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240223_234514__993 | 0 | 0.0 | 21.2405 | 0 | [0, 324] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_234514__993.json | 0.0 | missing | missing | missing | |
| 8583 | Apple-MacBook-Pro-M1 | keep_only_names | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240223_234533__532 | 5 | 0.0 | 18.8077 | 4 | [0, 283] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240223_234533__532.json | 100.0 | missing | missing | missing | |
| 8584 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231213_203244__676 | 0 | 0.000356 | 7.68303 | 0 | [61, 217] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_203244__676.json | 0.0 | missing | missing | missing | |
| 8585 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_194342__988 | 0 | 0.000302 | 3.11664 | 0 | [61, 181] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_194342__988.json | 0.0 | missing | missing | missing | |
| 8586 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_194344__180 | 0 | 0.0001895 | 2.28231 | 0 | [61, 106] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_194344__180.json | 0.0 | missing | missing | missing | |
| 8587 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195546__939 | 0 | 0.0 | 4.40246 | 0 | [61, 214] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_195546__939.json | 0.0 | 0.5 | missing | 0.5 |
| 8588 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_203236__868 | 0 | 0.000233 | 3.63352 | 0 | [64, 134] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_203236__868.json | 50.0 | missing | missing | missing | |
| 8589 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_194335__863 | 5 | 0.0002585 | 2.90681 | 4 | [64, 151] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_194335__863.json | 100.0 | missing | missing | missing | |
| 8590 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_194339__678 | 5 | 0.0003455 | 3.08631 | 4 | [64, 209] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_194339__678.json | 100.0 | missing | missing | missing | |
| 8591 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_201810__283 | 5 | 0.0002525 | 2.75361 | 4 | [64, 147] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_201810__283.json | 100.0 | missing | missing | missing | |
| 8592 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_201812__873 | 5 | 0.0001715 | 1.74801 | 4 | [64, 93] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_201812__873.json | 100.0 | missing | missing | missing | |
| 8593 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_195541__619 | 5 | 0.0 | 4.5403 | 4 | [64, 203] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_195541__619.json | 100.0 | 0.5 | missing | 0.5 |
| 8594 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203232__206 | 1 | 8.7e-5 | 1.14312 | 0 | [99, 25] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_203232__206.json | 55.0 | missing | missing | missing | |
| 8595 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194331__268 | 5 | 0.0002715 | 2.56343 | 4 | [99, 148] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_194331__268.json | 100.0 | missing | missing | missing | |
| 8596 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194332__669 | 5 | 0.0001665 | 1.71171 | 4 | [99, 78] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_194332__669.json | 100.0 | missing | missing | missing | |
| 8597 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201805__928 | 1 | 8.85e-5 | 0.678964 | 0 | [99, 26] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_201805__928.json | 55.0 | missing | missing | missing | |
| 8598 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201807__658 | 5 | 0.000177 | 1.74973 | 4 | [99, 85] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_201807__658.json | 100.0 | missing | missing | missing | |
| 8599 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_195536__581 | 5 | 0.0 | 1.02801 | 4 | [99, 25] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_195536__581.json | 100.0 | 0.5 | missing | 0.5 |
| 8600 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_203231__232 | 0 | 0.000199 | 2.44276 | 0 | [173, 75] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203231__232.json | 0.0 | missing | missing | missing | |
| 8601 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194326__360 | 5 | 0.000274 | 2.27934 | 4 | [173, 125] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194326__360.json | 100.0 | missing | missing | missing | |
| 8602 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_194328__339 | 0 | 0.0002515 | 1.83051 | 0 | [173, 110] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194328__339.json | 0.0 | missing | missing | missing | |
| 8603 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_201801__474 | 0 | 0.0002065 | 1.92518 | 0 | [173, 80] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201801__474.json | 0.0 | missing | missing | missing | |
| 8604 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_201804__411 | 5 | 0.0003355 | 3.3618 | 4 | [173, 166] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201804__411.json | 100.0 | missing | missing | missing | |
| 8605 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_195535__361 | 0 | 0.0 | 1.54121 | 0 | [173, 55] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195535__361.json | 0.0 | 0.5 | missing | 0.5 |
| 8606 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_203251__133 | 1 | 0.000367 | 5.20353 | 0 | [323, 137] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203251__133.json | 55.0 | missing | missing | missing | |
| 8607 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_194352__969 | 0 | 0.000361 | 2.19775 | 0 | [323, 133] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194352__969.json | 0.0 | missing | missing | missing | |
| 8608 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_194354__209 | 0 | 0.0002365 | 1.24145 | 0 | [323, 50] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194354__209.json | 0.0 | missing | missing | missing | |
| 8609 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_201817__136 | 0 | 0.0002485 | 2.07402 | 0 | [323, 58] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201817__136.json | 0.0 | missing | missing | missing | |
| 8610 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_201821__890 | 5 | 0.0004825 | 3.50197 | 4 | [323, 214] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201821__890.json | 100.0 | missing | missing | missing | |
| 8611 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_195552__797 | 1 | 0.0 | 4.7359 | 0 | [323, 236] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195552__797.json | 55.0 | 0.5 | missing | 0.5 |
| 8612 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_203246__441 | 0 | 0.0002285 | 1.94047 | 0 | [322, 45] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_203246__441.json | 0.0 | missing | missing | missing | |
| 8613 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194349__862 | 1 | 0.0004625 | 4.50357 | 0 | [322, 201] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_194349__862.json | 55.0 | missing | missing | missing | |
| 8614 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_194350__867 | 0 | 0.0002645 | 1.4451 | 0 | [322, 69] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_194350__867.json | 0.0 | missing | missing | missing | |
| 8615 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_201814__919 | 0 | 0.0003575 | 2.1641 | 0 | [322, 131] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_201814__919.json | 0.0 | missing | missing | missing | |
| 8616 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_201815__453 | 0 | 0.000245 | 1.53634 | 0 | [322, 56] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_201815__453.json | 0.0 | missing | missing | missing | |
| 8617 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_195547__283 | 0 | 0.0 | 1.4059 | 0 | [322, 44] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_195547__283.json | 0.0 | 0.5 | missing | 0.5 |
| 8618 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200537__679 | 5 | 0.0002015 | 1.13387 | 4 | [64, 113] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200537__679.json | 100.0 | missing | missing | missing | |
| 8619 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200538__198 | 5 | 0.0002375 | 1.06048 | 4 | [64, 137] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200538__198.json | 100.0 | missing | missing | missing | |
| 8620 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200540__422 | 5 | 0.0002315 | 1.30731 | 4 | [64, 133] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200540__422.json | 100.0 | missing | missing | missing | |
| 8621 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200541__479 | 5 | 0.000197 | 0.897183 | 4 | [64, 110] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200541__479.json | 100.0 | missing | missing | missing | |
| 8622 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200542__948 | 5 | 0.0002165 | 1.08559 | 4 | [64, 123] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200542__948.json | 100.0 | missing | missing | missing | |
| 8623 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200534__206 | 5 | 8.7e-5 | 0.456195 | 4 | [99, 25] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200534__206.json | 100.0 | missing | missing | missing | |
| 8624 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200534__335 | 5 | 9.3e-5 | 0.50341 | 4 | [99, 29] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200534__335.json | 100.0 | missing | missing | missing | |
| 8625 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200535__249 | 5 | 9.3e-5 | 0.46454 | 4 | [99, 29] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200535__249.json | 100.0 | missing | missing | missing | |
| 8626 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200535__639 | 5 | 9.3e-5 | 0.636804 | 4 | [99, 29] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200535__639.json | 100.0 | missing | missing | missing | |
| 8627 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200536__438 | 5 | 8.7e-5 | 0.621325 | 4 | [99, 25] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200536__438.json | 100.0 | missing | missing | missing | |
| 8628 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200529__303 | 5 | 0.000346 | 1.5217 | 4 | [173, 173] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200529__303.json | 100.0 | missing | missing | missing | |
| 8629 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200530__871 | 5 | 0.0001855 | 0.730821 | 4 | [173, 66] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200530__871.json | 100.0 | missing | missing | missing | |
| 8630 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_200530__913 | 0 | 0.000214 | 0.847252 | 0 | [173, 85] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200530__913.json | 0.0 | missing | missing | missing | |
| 8631 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200532__194 | 5 | 0.0003715 | 1.49421 | 4 | [173, 190] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200532__194.json | 100.0 | missing | missing | missing | |
| 8632 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200533__383 | 5 | 0.000241 | 1.08676 | 4 | [173, 103] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200533__383.json | 100.0 | missing | missing | missing | |
| 8633 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200547__824 | 3 | 0.0002155 | 0.75368 | 4 | [323, 36] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200547__824.json | 90.0 | missing | missing | missing | |
| 8634 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200547__979 | 0 | 0.0002965 | 0.785256 | 0 | [323, 90] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200547__979.json | 0.0 | missing | missing | missing | |
| 8635 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200548__291 | 0 | 0.00022 | 0.490398 | 0 | [323, 39] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200548__291.json | 0.0 | missing | missing | missing | |
| 8636 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_200548__800 | 0 | 0.0002185 | 0.528635 | 0 | [323, 38] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200548__800.json | 50.0 | missing | missing | missing | |
| 8637 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200550__906 | 0 | 0.0002995 | 1.09331 | 0 | [323, 92] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200550__906.json | 0.0 | missing | missing | missing | |
| 8638 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200542__567 | 3 | 0.000212 | 0.471149 | 4 | [322, 34] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200542__567.json | 90.0 | missing | missing | missing | |
| 8639 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200543__368 | 3 | 0.000347 | 1.06439 | 4 | [322, 124] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200543__368.json | 90.0 | missing | missing | missing | |
| 8640 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200544__584 | 5 | 0.0002945 | 0.828884 | 4 | [322, 89] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200544__584.json | 100.0 | missing | missing | missing | |
| 8641 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200545__317 | 4 | 0.0002195 | 0.489082 | 4 | [322, 39] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200545__317.json | 95.0 | missing | missing | missing | |
| 8642 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200546__326 | 3 | 0.000335 | 1.02908 | 4 | [322, 116] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200546__326.json | 90.0 | missing | missing | missing | |
| 8643 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231213_203259__967 | 0 | 0.000283 | 2.80075 | 0 | [61, 111] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_203259__967.json | 0.0 | missing | missing | missing | |
| 8644 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_194403__670 | 0 | 0.000245 | 1.52445 | 0 | [61, 92] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_194403__670.json | 0.0 | missing | missing | missing | |
| 8645 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_194405__319 | 0 | 0.000355 | 1.86986 | 0 | [61, 147] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_194405__319.json | 0.0 | missing | missing | missing | |
| 8646 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195601__106 | 0 | 0.0 | 2.78902 | 0 | [61, 130] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_195601__106.json | 0.0 | 0.9 | missing | 0.1 |
| 8647 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231213_203256__351 | 5 | 0.000344 | 2.99437 | 4 | [64, 140] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_203256__351.json | 100.0 | missing | missing | missing | |
| 8648 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_194359__378 | 5 | 0.000304 | 1.73019 | 4 | [64, 120] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_194359__378.json | 100.0 | missing | missing | missing | |
| 8649 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_194402__929 | 5 | 0.00035 | 1.98007 | 4 | [64, 143] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_194402__929.json | 100.0 | missing | missing | missing | |
| 8650 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_201829__498 | 5 | 0.000318 | 2.43653 | 4 | [64, 127] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_201829__498.json | 100.0 | missing | missing | missing | |
| 8651 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_201831__541 | 5 | 0.000302 | 2.41747 | 4 | [64, 119] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_201831__541.json | 100.0 | missing | missing | missing | |
| 8652 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_195558__248 | 5 | 0.0 | 3.07379 | 4 | [64, 131] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_195558__248.json | 100.0 | 0.9 | missing | 0.1 |
| 8653 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203253__416 | 5 | 0.000147 | 0.621745 | 4 | [99, 24] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_203253__416.json | 100.0 | missing | missing | missing | |
| 8654 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194357__932 | 1 | 0.000149 | 0.555498 | 0 | [99, 25] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_194357__932.json | 55.0 | missing | missing | missing | |
| 8655 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194358__990 | 3 | 0.000155 | 0.877857 | 4 | [99, 28] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_194358__990.json | 90.0 | missing | missing | missing | |
| 8656 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201825__693 | 5 | 0.000149 | 0.903614 | 4 | [99, 25] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_201825__693.json | 100.0 | missing | missing | missing | |
| 8657 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201826__988 | 5 | 0.000157 | 1.08194 | 4 | [99, 29] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_201826__988.json | 100.0 | missing | missing | missing | |
| 8658 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_195555__676 | 5 | 0.0 | 1.69758 | 4 | [99, 29] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_195555__676.json | 100.0 | 0.9 | missing | 0.1 |
| 8659 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_203252__946 | 5 | 0.000269 | 0.956996 | 4 | [173, 48] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203252__946.json | 100.0 | missing | missing | missing | |
| 8660 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194355__677 | 5 | 0.000263 | 0.912335 | 4 | [173, 45] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194355__677.json | 100.0 | missing | missing | missing | |
| 8661 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194356__282 | 5 | 0.000263 | 1.00966 | 4 | [173, 45] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194356__282.json | 100.0 | missing | missing | missing | |
| 8662 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_201822__787 | 5 | 0.000255 | 1.38713 | 4 | [173, 41] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201822__787.json | 100.0 | missing | missing | missing | |
| 8663 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_201824__771 | 0 | 0.000293 | 1.47628 | 0 | [173, 60] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201824__771.json | 0.0 | missing | missing | missing | |
| 8664 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_195553__887 | 5 | 0.0 | 0.92466 | 4 | [173, 48] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195553__887.json | 100.0 | 0.9 | missing | 0.1 |
| 8665 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_203304__670 | 3 | 0.000607 | 3.66423 | 4 | [323, 142] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203304__670.json | 90.0 | missing | missing | missing | |
| 8666 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_194409__418 | 0 | 0.000539 | 2.01813 | 0 | [323, 108] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194409__418.json | 50.0 | missing | missing | missing | |
| 8667 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_194411__475 | 0 | 0.000447 | 1.10104 | 0 | [323, 62] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194411__475.json | 0.0 | missing | missing | missing | |
| 8668 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_201836__817 | 0 | 0.000467 | 1.53328 | 0 | [323, 72] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201836__817.json | 0.0 | missing | missing | missing | |
| 8669 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_201838__217 | 5 | 0.000571 | 2.45756 | 4 | [323, 124] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_201838__217.json | 100.0 | missing | missing | missing | |
| 8670 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_195603__445 | 0 | 0.0 | 1.49267 | 0 | [323, 63] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195603__445.json | 0.0 | 0.9 | missing | 0.1 |
| 8671 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_203300__861 | 1 | 0.000382 | 0.69625 | 0 | [322, 30] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_203300__861.json | 55.0 | missing | missing | missing | |
| 8672 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_194407__609 | 0 | 0.000446 | 1.42703 | 0 | [322, 62] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_194407__609.json | 25.0 | missing | missing | missing | |
| 8673 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194407__915 | 3 | 0.000378 | 0.591269 | 4 | [322, 28] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_194407__915.json | 90.0 | missing | missing | missing | |
| 8674 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_201833__109 | 5 | 0.000372 | 1.05303 | 4 | [322, 25] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_201833__109.json | 100.0 | missing | missing | missing | |
| 8675 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_201834__561 | 0 | 0.00046 | 1.49761 | 0 | [322, 69] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_201834__561.json | 0.0 | missing | missing | missing | |
| 8676 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_195602__821 | 3 | 0.0 | 1.03641 | 4 | [322, 36] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_195602__821.json | 90.0 | 0.9 | missing | 0.1 |
| 8677 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_101604__678 | 5 | 0.00601 | 12.7063 | 4 | [64, 179] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_101604__678.json | 100.0 | missing | missing | missing | |
| 8678 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_101626__179 | 3 | 0.00916 | 21.554 | 4 | [64, 284] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_101626__179.json | 90.0 | missing | missing | missing | |
| 8679 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_101654__465 | 5 | 0.00937 | 28.4979 | 4 | [64, 291] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_101654__465.json | 100.0 | missing | missing | missing | |
| 8680 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_101718__813 | 5 | 0.00976 | 23.9301 | 4 | [64, 304] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_101718__813.json | 100.0 | missing | missing | missing | |
| 8681 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_101733__191 | 5 | 0.00874 | 14.2388 | 4 | [64, 270] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_101733__191.json | 100.0 | missing | missing | missing | |
| 8682 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_101333__371 | 5 | 0.00186 | 3.01941 | 4 | [99, 29] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_101333__371.json | 100.0 | missing | missing | missing | |
| 8683 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_101336__837 | 5 | 0.00186 | 2.85746 | 4 | [99, 29] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_101336__837.json | 100.0 | missing | missing | missing | |
| 8684 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_101339__924 | 5 | 0.00171 | 2.81614 | 4 | [99, 24] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_101339__924.json | 100.0 | missing | missing | missing | |
| 8685 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_101342__443 | 5 | 0.00174 | 2.6793 | 4 | [99, 25] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_101342__443.json | 100.0 | missing | missing | missing | |
| 8686 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_101345__369 | 5 | 0.00186 | 3.50664 | 4 | [99, 29] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_101345__369.json | 100.0 | missing | missing | missing | |
| 8687 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_101216__825 | 3 | 0.00848 | 18.1724 | 4 | [173, 225] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_101216__825.json | 90.0 | missing | missing | missing | |
| 8688 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_101230__237 | 5 | 0.00632 | 14.4812 | 4 | [173, 153] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_101230__237.json | 100.0 | missing | missing | missing | |
| 8689 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_101253__681 | 3 | 0.00935 | 22.3806 | 4 | [173, 254] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_101253__681.json | 90.0 | missing | missing | missing | |
| 8690 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_101307__946 | 0 | 0.0059 | 13.9533 | 0 | [173, 139] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_101307__946.json | 25.0 | missing | missing | missing | |
| 8691 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_101318__797 | 0 | 0.00563 | 10.8395 | 0 | [173, 130] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_101318__797.json | 0.0 | missing | missing | missing | |
| 8692 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_102649__559 | 1 | 0.00959 | 22.1043 | 0 | [323, 212] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_102649__559.json | 55.0 | missing | missing | missing | |
| 8693 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_102728__965 | 5 | 0.01442 | 38.9346 | 4 | [323, 373] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_102728__965.json | 100.0 | missing | missing | missing | |
| 8694 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_102808__846 | 3 | 0.0185 | 39.7601 | 4 | [323, 509] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_102808__846.json | 90.0 | missing | missing | missing | |
| 8695 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_102849__820 | 1 | 0.01559 | 40.2475 | 0 | [323, 412] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_102849__820.json | 55.0 | missing | missing | missing | |
| 8696 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_102925__446 | 3 | 0.01406 | 36.2712 | 4 | [323, 361] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_102925__446.json | 90.0 | missing | missing | missing | |
| 8697 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_102055__432 | 5 | 0.01171 | 22.8221 | 4 | [322, 283] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_102055__432.json | 100.0 | missing | missing | missing | |
| 8698 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_102142__961 | 5 | 0.01762 | 46.9949 | 4 | [322, 480] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_102142__961.json | 100.0 | missing | missing | missing | |
| 8699 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_102232__599 | 5 | 0.01648 | 49.9852 | 4 | [322, 442] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_102232__599.json | 100.0 | missing | missing | missing | |
| 8700 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_102309__547 | 1 | 0.01372 | 36.8856 | 0 | [322, 350] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_102309__547.json | 55.0 | missing | missing | missing | |
| 8701 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_102351__760 | 3 | 0.01429 | 42.2912 | 4 | [322, 369] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_102351__760.json | 90.0 | missing | missing | missing | |
| 8702 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231213_203437__752 | 0 | 0.00763 | 18.2363 | 0 | [61, 234] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_203437__752.json | 0.0 | missing | missing | missing | |
| 8703 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_194505__949 | 0 | 0.00607 | 6.47573 | 0 | [61, 182] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_194505__949.json | 0.0 | missing | missing | missing | |
| 8704 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_194518__376 | 0 | 0.00847 | 12.7463 | 0 | [61, 262] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_194518__376.json | 0.0 | missing | missing | missing | |
| 8705 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195728__683 | 0 | 0.0 | 39.3304 | 0 | [61, 390] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_195728__683.json | 0.0 | 0.1 | missing | 0.9 | |
| 8706 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_203418__991 | 5 | 0.0121 | 44.3289 | 4 | [64, 382] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_203418__991.json | 100.0 | missing | missing | missing | |
| 8707 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_194446__131 | 5 | 0.00595 | 9.18609 | 4 | [64, 177] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_194446__131.json | 100.0 | missing | missing | missing | |
| 8708 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_194458__260 | 3 | 0.00799 | 12.5192 | 4 | [64, 245] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_194458__260.json | 90.0 | missing | missing | missing | |
| 8709 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_202018__141 | 3 | 0.00916 | 24.6392 | 4 | [64, 284] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_202018__141.json | 90.0 | missing | missing | missing | |
| 8710 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_202038__999 | 5 | 0.00781 | 19.0237 | 4 | [64, 239] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_202038__999.json | 100.0 | missing | missing | missing | |
| 8711 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_195649__619 | 5 | 0.0 | 17.0117 | 4 | [64, 194] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_195649__619.json | 100.0 | 0.1 | missing | 0.9 | |
| 8712 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203334__748 | 5 | 0.00321 | 8.50213 | 4 | [99, 74] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_203334__748.json | 100.0 | missing | missing | missing | |
| 8713 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194434__696 | 5 | 0.00333 | 2.94412 | 4 | [99, 78] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_194434__696.json | 100.0 | missing | missing | missing | |
| 8714 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194436__385 | 5 | 0.00231 | 1.79963 | 4 | [99, 44] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_194436__385.json | 100.0 | missing | missing | missing | |
| 8715 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201939__422 | 5 | 0.00171 | 4.44928 | 4 | [99, 24] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_201939__422.json | 100.0 | missing | missing | missing | |
| 8716 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_201954__200 | 3 | 0.00432 | 14.5242 | 4 | [99, 111] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_201954__200.json | 90.0 | missing | missing | missing | |
| 8717 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_195632__717 | 3 | 0.0 | 8.37146 | 4 | [99, 90] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_195632__717.json | 90.0 | 0.1 | missing | 0.9 | |
| 8718 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_203325__682 | 3 | 0.00983 | 20.4177 | 4 | [173, 270] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203325__682.json | 90.0 | missing | missing | missing | |
| 8719 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194419__819 | 3 | 0.00815 | 8.69544 | 4 | [173, 214] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194419__819.json | 90.0 | missing | missing | missing | |
| 8720 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194431__662 | 3 | 0.00893 | 11.5748 | 4 | [173, 240] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194431__662.json | 90.0 | missing | missing | missing | |
| 8721 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_201857__214 | 5 | 0.00935 | 18.4136 | 4 | [173, 254] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201857__214.json | 100.0 | missing | missing | missing | |
| 8722 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_201934__498 | 5 | 0.00683 | 37.3049 | 4 | [173, 170] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_201934__498.json | 100.0 | missing | missing | missing | |
| 8723 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_195623__521 | 3 | 0.0 | 19.5282 | 4 | [173, 234] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195623__521.json | 90.0 | 0.1 | missing | 0.9 | |
| 8724 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_203522__604 | 4 | 0.01151 | 20.8077 | 4 | [323, 276] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203522__604.json | 95.0 | missing | missing | missing | |
| 8725 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_194558__449 | 5 | 0.0137 | 13.9652 | 4 | [323, 349] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194558__449.json | 100.0 | missing | missing | missing | |
| 8726 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_194610__819 | 4 | 0.0092 | 12.295 | 4 | [323, 199] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194610__819.json | 95.0 | missing | missing | missing | |
| 8727 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_202133__446 | 3 | 0.01094 | 16.0909 | 4 | [323, 257] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202133__446.json | 90.0 | missing | missing | missing | |
| 8728 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_202143__874 | 3 | 0.00824 | 9.85309 | 4 | [323, 167] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202143__874.json | 90.0 | missing | missing | missing | |
| 8729 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_195818__193 | 1 | 0.0 | 24.1129 | 0 | [323, 333] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195818__193.json | 55.0 | 0.1 | missing | 0.9 | |
| 8730 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_203501__584 | 5 | 0.01177 | 23.6456 | 4 | [322, 285] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_203501__584.json | 100.0 | missing | missing | missing | |
| 8731 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194528__952 | 1 | 0.00832 | 10.7703 | 0 | [322, 170] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_194528__952.json | 55.0 | missing | missing | missing | |
| 8732 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194544__410 | 5 | 0.01252 | 15.0117 | 4 | [322, 310] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_194544__410.json | 100.0 | missing | missing | missing | |
| 8733 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202106__151 | 1 | 0.01501 | 28.3364 | 0 | [322, 393] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_202106__151.json | 55.0 | missing | missing | missing | |
| 8734 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202117__495 | 1 | 0.00913 | 10.9414 | 0 | [322, 197] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_202117__495.json | 55.0 | missing | missing | missing | |
| 8735 | Apple-MacBook-Pro-M1 | keep_only_names | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_195754__540 | 1 | 0.0 | 25.9139 | 0 | [322, 304] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_195754__540.json | 55.0 | 0.1 | missing | 0.9 | |
| 8736 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | AsIs | 1SHOT | false | false | 5 | 20231214_075157__128 | 0 | 0.0 | 9.20086 | 0 | [52, 279] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__AsIs__1SHOT__20231214_075157__128.json | 0.0 | missing | missing | missing | |
| 8737 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_114000__353 | 0 | 0.0 | 9.56464 | 0 | [52, 292] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__AsIs__1SHOT__20231225_114000__353.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8738 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_114010__612 | 0 | 0.0 | 10.0755 | 0 | [1, 319] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__AsIs__1SHOT__20231225_114010__612.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8739 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | InJulia | 1SHOT | true | true | 5 | 20231214_075147__232 | 1 | 0.0 | 9.14014 | 0 | [69, 271] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__InJulia__1SHOT__20231214_075147__232.json | 55.0 | missing | missing | missing | |
| 8740 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_113939__287 | 0 | 0.0 | 11.0209 | 0 | [69, 331] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__InJulia__1SHOT__20231225_113939__287.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8741 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_113950__879 | 3 | 0.0 | 10.8064 | 4 | [1, 341] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__InJulia__1SHOT__20231225_113950__879.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8742 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | InJulia | 1SHOT | true | true | 5 | 20231227_011538__125 | 0 | 0.0 | 7.97343 | 4 | [69, 241] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__InJulia__1SHOT__20231227_011538__125.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8743 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075138__953 | 1 | 0.0 | 6.77455 | 0 | [98, 190] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_075138__953.json | 55.0 | missing | missing | missing | |
| 8744 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_113920__716 | 1 | 0.0 | 8.37718 | 0 | [98, 241] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_113920__716.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8745 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_113928__561 | 1 | 0.0 | 7.45368 | 0 | [1, 235] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_113928__561.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8746 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011530__233 | 2 | 0.0 | 7.76225 | 4 | [98, 225] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_011530__233.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8747 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_075131__327 | 1 | 0.0 | 11.0089 | 0 | [188, 288] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075131__327.json | 55.0 | missing | missing | missing | |
| 8748 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_113904__172 | 1 | 0.0 | 20.2761 | 0 | [206, 398] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113904__172.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8749 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_113912__726 | 1 | 0.0 | 7.32642 | 0 | [1, 224] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113912__726.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8750 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_011522__320 | 1 | 0.0 | 19.4021 | 0 | [206, 401] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011522__320.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8751 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_075249__867 | 0 | 0.0 | 30.8058 | 0 | [11, 806] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075249__867.json | 50.0 | missing | missing | missing | |
| 8752 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_114107__648 | 0 | 0.0 | 16.4183 | 0 | [11, 452] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114107__648.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8753 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_114140__671 | 0 | 0.0 | 32.8506 | 0 | [1, 862] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114140__671.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8754 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_011617__163 | 0 | 0.0 | 18.3149 | 0 | [11, 506] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011617__163.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8755 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_075219__337 | 0 | 0.0 | 21.922 | 0 | [369, 512] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_075219__337.json | 25.0 | missing | missing | missing | |
| 8756 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_114029__281 | 0 | 0.0 | 19.2851 | 0 | [369, 447] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_114029__281.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8757 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114050__407 | 2 | 0.0 | 20.8201 | 4 | [1, 571] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_114050__407.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8758 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_011558__697 | 1 | 0.0 | 20.139 | 4 | [369, 473] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_011558__697.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8759 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | AsIs | 1SHOT | false | false | 5 | 20231214_075934__182 | 0 | 0.0 | 8.49155 | 0 | [52, 256] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__AsIs__1SHOT__20231214_075934__182.json | 0.0 | missing | missing | missing | |
| 8760 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_115621__363 | 0 | 0.0 | 3.99574 | 0 | [66, 124] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__AsIs__1SHOT__20231225_115621__363.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8761 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_115625__712 | 0 | 0.0 | 4.17187 | 0 | [66, 131] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__AsIs__1SHOT__20231225_115625__712.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8762 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | InJulia | 1SHOT | true | true | 5 | 20231214_075925__727 | 1 | 0.0 | 10.7911 | 0 | [69, 322] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__InJulia__1SHOT__20231214_075925__727.json | 55.0 | missing | missing | missing | |
| 8763 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_115611__129 | 5 | 0.0 | 3.87066 | 4 | [69, 120] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__InJulia__1SHOT__20231225_115611__129.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8764 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_115617__699 | 5 | 0.0 | 5.7648 | 4 | [69, 186] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__InJulia__1SHOT__20231225_115617__699.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8765 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_012301__545 | 5 | 0.0 | 7.67375 | 4 | [69, 249] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__InJulia__1SHOT__20231227_012301__545.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8766 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075914__609 | 1 | 0.0 | 8.57943 | 0 | [98, 246] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_075914__609.json | 55.0 | missing | missing | missing | |
| 8767 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_115559__493 | 1 | 0.0 | 7.70762 | 0 | [108, 247] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_115559__493.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8768 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_115607__814 | 5 | 0.0 | 7.12072 | 4 | [108, 225] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_115607__814.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8769 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_012253__535 | 5 | 0.0 | 6.41564 | 4 | [108, 201] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_012253__535.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8770 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_075906__373 | 0 | 0.0 | 14.1637 | 0 | [188, 380] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075906__373.json | 0.0 | missing | missing | missing | |
| 8771 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_115545__755 | 1 | 0.0 | 12.9754 | 4 | [198, 207] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115545__755.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8772 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_115552__214 | 5 | 0.0 | 6.03938 | 4 | [198, 174] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115552__214.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8773 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_012247__408 | 5 | 0.0 | 12.5072 | 4 | [198, 196] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_012247__408.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8774 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_080019__909 | 1 | 0.0 | 17.8886 | 4 | [11, 488] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080019__909.json | 80.0 | missing | missing | missing | |
| 8775 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_115644__645 | 5 | 0.0 | 6.42166 | 4 | [372, 159] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115644__645.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8776 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_115657__683 | 1 | 0.0 | 13.0512 | 0 | [372, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115657__683.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8777 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_012318__322 | 0 | 0.0 | 8.25777 | 0 | [372, 218] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_012318__322.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8778 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_080001__644 | 0 | 0.0 | 27.6734 | 0 | [369, 656] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_080001__644.json | 50.0 | missing | missing | missing | |
| 8779 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115632__599 | 5 | 0.0 | 6.90201 | 4 | [369, 175] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_115632__599.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8780 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115638__214 | 3 | 0.0 | 5.04292 | 4 | [369, 114] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_115638__214.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8781 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_012310__650 | 5 | 0.0 | 8.13264 | 4 | [369, 214] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_012310__650.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8782 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181743__639 | 5 | 0.0 | 11.331 | 4 | [69, 207] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181743__639.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8783 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181755__859 | 5 | 0.0 | 11.6128 | 4 | [69, 208] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181755__859.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8784 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181805__878 | 5 | 0.0 | 10.0143 | 4 | [69, 180] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181805__878.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8785 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181716__581 | 1 | 0.0 | 12.4516 | 4 | [108, 222] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181716__581.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8786 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181725__970 | 5 | 0.0 | 8.43832 | 4 | [108, 145] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181725__970.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8787 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181732__789 | 1 | 0.0 | 6.9484 | 0 | [108, 119] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181732__789.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8788 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_181641__913 | 5 | 0.0 | 11.7527 | 4 | [198, 192] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181641__913.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8789 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_181651__526 | 5 | 0.0 | 10.184 | 4 | [198, 170] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181651__526.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8790 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_181703__902 | 5 | 0.0 | 12.1288 | 4 | [198, 202] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181703__902.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8791 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181856__900 | 5 | 0.0 | 14.5329 | 4 | [372, 238] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181856__900.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8792 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181905__266 | 1 | 0.0 | 8.94161 | 0 | [372, 133] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181905__266.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8793 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181914__951 | 1 | 0.0 | 8.04417 | 0 | [372, 116] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181914__951.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8794 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181818__688 | 1 | 0.0 | 12.4553 | 0 | [369, 203] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181818__688.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8795 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181832__401 | 0 | 0.0 | 14.2693 | 0 | [369, 228] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181832__401.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8796 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181842__770 | 5 | 0.0 | 9.19414 | 4 | [369, 133] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181842__770.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8797 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231213_203639__962 | 0 | 0.00160473 | 4.00034 | 0 | [64, 177] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__AsIs__1SHOT__20231213_203639__962.json | 0.0 | missing | missing | missing | |
| 8798 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_194846__642 | 0 | 0.0042178 | 11.1342 | 0 | [64, 500] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__AsIs__1SHOT__20231225_194846__642.json | 0.0 | missing | missing | missing | |
| 8799 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_194851__906 | 0 | 0.0019526 | 4.91017 | 0 | [64, 220] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__AsIs__1SHOT__20231225_194851__906.json | 0.0 | missing | missing | missing | |
| 8800 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195956__870 | 0 | 0.0 | 16.9903 | 0 | [64, 217] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__AsIs__1SHOT__20231215_195956__870.json | 0.0 | 0.9 | missing | 0.3 | |
| 8801 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231213_203635__119 | 5 | 0.00347353 | 9.1096 | 4 | [67, 407] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__InJulia__1SHOT__20231213_203635__119.json | 100.0 | missing | missing | missing | |
| 8802 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_194828__909 | 1 | 0.00166137 | 4.17303 | 0 | [67, 183] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__InJulia__1SHOT__20231225_194828__909.json | 55.0 | missing | missing | missing | |
| 8803 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_194835__586 | 4 | 0.00277779 | 7.16999 | 4 | [67, 321] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__InJulia__1SHOT__20231225_194835__586.json | 95.0 | missing | missing | missing | |
| 8804 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_202332__294 | 1 | 0.00258363 | 6.64718 | 0 | [67, 297] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__InJulia__1SHOT__20231227_202332__294.json | 55.0 | missing | missing | missing | |
| 8805 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_202343__530 | 5 | 0.00403174 | 10.6283 | 4 | [67, 476] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__InJulia__1SHOT__20231227_202343__530.json | 100.0 | missing | missing | missing | |
| 8806 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium--optim | InJulia | 1SHOT | true | true | 5 | 20231215_195939__973 | 5 | 0.0 | 22.3035 | 4 | [67, 396] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__InJulia__1SHOT__20231215_195939__973.json | 100.0 | 0.9 | missing | 0.3 | |
| 8807 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203625__602 | 1 | 0.00137835 | 4.54727 | 0 | [106, 135] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_203625__602.json | 55.0 | missing | missing | missing | |
| 8808 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194820__727 | 1 | 0.00227634 | 5.51022 | 0 | [106, 246] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_194820__727.json | 55.0 | missing | missing | missing | |
| 8809 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194823__124 | 5 | 0.00143498 | 3.24627 | 4 | [106, 142] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_194823__124.json | 100.0 | missing | missing | missing | |
| 8810 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_202321__714 | 1 | 0.00211454 | 5.28237 | 0 | [106, 226] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_202321__714.json | 55.0 | missing | missing | missing | |
| 8811 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_202325__982 | 5 | 0.00159678 | 3.68448 | 4 | [106, 162] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_202325__982.json | 100.0 | missing | missing | missing | |
| 8812 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_195916__797 | 1 | 0.0 | 3.21571 | 0 | [106, 145] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_195916__797.json | 55.0 | 0.9 | missing | 0.3 | |
| 8813 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_203621__868 | 5 | 0.00169416 | 11.0647 | 4 | [196, 144] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203621__868.json | 100.0 | missing | missing | missing | |
| 8814 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194808__987 | 1 | 0.00366812 | 8.75636 | 0 | [196, 388] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194808__987.json | 55.0 | missing | missing | missing | |
| 8815 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194814__875 | 1 | 0.00259215 | 5.73949 | 0 | [196, 255] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194814__875.json | 55.0 | missing | missing | missing | |
| 8816 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_202309__410 | 5 | 0.00484117 | 12.0693 | 4 | [196, 533] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202309__410.json | 100.0 | missing | missing | missing | |
| 8817 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_202316__860 | 1 | 0.00299665 | 6.982 | 0 | [196, 305] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202316__860.json | 55.0 | missing | missing | missing | |
| 8818 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_195913__561 | 1 | 0.0 | 8.22819 | 0 | [196, 373] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195913__561.json | 55.0 | 0.9 | missing | 0.3 | |
| 8819 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_203723__566 | 4 | 0.00436174 | 32.6067 | 4 | [369, 416] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203723__566.json | 95.0 | missing | missing | missing | |
| 8820 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_194952__195 | 0 | 0.00449118 | 16.8876 | 0 | [369, 432] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194952__195.json | 50.0 | missing | missing | missing | |
| 8821 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_195006__310 | 0 | 0.00403814 | 14.0067 | 0 | [369, 376] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_195006__310.json | 50.0 | missing | missing | missing | |
| 8822 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_202419__361 | 0 | 0.00271947 | 10.3322 | 0 | [369, 213] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202419__361.json | 0.0 | missing | missing | missing | |
| 8823 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_202447__276 | 1 | 0.00454781 | 27.2827 | 0 | [369, 439] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202447__276.json | 55.0 | missing | missing | missing | |
| 8824 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_200218__706 | 4 | 0.0 | 22.1282 | 4 | [369, 466] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_200218__706.json | 95.0 | 0.9 | missing | 0.3 | |
| 8825 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_203650__738 | 0 | 0.00508174 | 11.5657 | 0 | [366, 506] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_203650__738.json | 25.0 | missing | missing | missing | |
| 8826 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194901__671 | 4 | 0.00458825 | 10.0966 | 4 | [366, 445] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_194901__671.json | 95.0 | missing | missing | missing | |
| 8827 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194935__265 | 0 | 0.00804268 | 33.3091 | 0 | [366, 872] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_194935__265.json | 50.0 | missing | missing | missing | |
| 8828 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202351__116 | 0 | 0.00379543 | 8.46975 | 0 | [366, 347] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_202351__116.json | 50.0 | missing | missing | missing | |
| 8829 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202409__662 | 0 | 0.00504938 | 17.5084 | 0 | [366, 502] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_202409__662.json | 50.0 | missing | missing | missing | |
| 8830 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231213_203602__753 | 0 | 0.000222475 | 1.41871 | 0 | [65, 93] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__AsIs__1SHOT__20231213_203602__753.json | 0.0 | missing | missing | missing | |
| 8831 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_194726__196 | 0 | 0.000274855 | 1.81609 | 0 | [65, 120] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__AsIs__1SHOT__20231225_194726__196.json | 0.0 | missing | missing | missing | |
| 8832 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_194728__498 | 0 | 0.000317535 | 2.45231 | 0 | [65, 142] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__AsIs__1SHOT__20231225_194728__498.json | 0.0 | missing | missing | missing | |
| 8833 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195850__228 | 0 | 0.0 | 1.39042 | 0 | [65, 93] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__AsIs__1SHOT__20231215_195850__228.json | 0.0 | 0.9 | missing | 0.3 | |
| 8834 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231213_203601__343 | 1 | 0.000430056 | 2.76582 | 0 | [68, 199] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__InJulia__1SHOT__20231213_203601__343.json | 55.0 | missing | missing | missing | |
| 8835 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_194717__219 | 5 | 0.000684196 | 12.6436 | 4 | [68, 330] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__InJulia__1SHOT__20231225_194717__219.json | 100.0 | missing | missing | missing | |
| 8836 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_194724__335 | 5 | 0.000895656 | 6.4346 | 4 | [68, 439] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__InJulia__1SHOT__20231225_194724__335.json | 100.0 | missing | missing | missing | |
| 8837 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_202232__693 | 1 | 0.000558096 | 3.58994 | 0 | [68, 265] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__InJulia__1SHOT__20231227_202232__693.json | 55.0 | missing | missing | missing | |
| 8838 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_202235__473 | 1 | 0.000366036 | 2.70989 | 0 | [68, 166] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__InJulia__1SHOT__20231227_202235__473.json | 55.0 | missing | missing | missing | |
| 8839 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small--optim | InJulia | 1SHOT | true | true | 5 | 20231215_195848__960 | 1 | 0.0 | 4.24616 | 0 | [68, 316] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__InJulia__1SHOT__20231215_195848__960.json | 55.0 | 0.9 | missing | 0.3 | |
| 8840 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203558__191 | 5 | 0.000596263 | 3.69605 | 4 | [109, 271] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_203558__191.json | 100.0 | missing | missing | missing | |
| 8841 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194700__404 | 2 | 0.000732063 | 4.65164 | 4 | [109, 341] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_194700__404.json | 85.0 | missing | missing | missing | |
| 8842 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194704__771 | 1 | 0.000677743 | 4.26225 | 0 | [109, 313] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_194704__771.json | 55.0 | missing | missing | missing | |
| 8843 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_202223__181 | 0 | 0.000353763 | 2.29216 | 0 | [109, 146] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_202223__181.json | 50.0 | missing | missing | missing | |
| 8844 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_202229__561 | 5 | 0.000906663 | 5.75367 | 4 | [109, 431] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_202229__561.json | 100.0 | missing | missing | missing | |
| 8845 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_195844__326 | 5 | 0.0 | 3.65311 | 4 | [109, 275] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_195844__326.json | 100.0 | 0.9 | missing | 0.3 | |
| 8846 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_203554__881 | 5 | 0.000602113 | 3.33434 | 4 | [199, 244] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203554__881.json | 100.0 | missing | missing | missing | |
| 8847 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194649__529 | 5 | 0.000588533 | 3.3111 | 4 | [199, 237] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194649__529.json | 100.0 | missing | missing | missing | |
| 8848 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_194655__614 | 1 | 0.000883413 | 5.28758 | 0 | [199, 389] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194655__614.json | 55.0 | missing | missing | missing | |
| 8849 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_202216__713 | 5 | 0.000549733 | 3.58361 | 4 | [199, 217] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202216__713.json | 100.0 | missing | missing | missing | |
| 8850 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_202220__326 | 5 | 0.000735973 | 4.28695 | 4 | [199, 313] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202220__326.json | 100.0 | missing | missing | missing | |
| 8851 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_195840__813 | 5 | 0.0 | 4.51071 | 4 | [199, 335] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195840__813.json | 100.0 | 0.9 | missing | 0.3 | |
| 8852 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_203609__733 | 1 | 0.000731505 | 3.45168 | 4 | [375, 252] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203609__733.json | 80.0 | missing | missing | missing | |
| 8853 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_194741__391 | 5 | 0.000964305 | 5.16545 | 4 | [375, 372] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194741__391.json | 100.0 | missing | missing | missing | |
| 8854 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_194759__998 | 5 | 0.000919685 | 17.9624 | 4 | [375, 349] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194759__998.json | 100.0 | missing | missing | missing | |
| 8855 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_202250__501 | 5 | 0.00133873 | 7.58154 | 4 | [375, 565] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202250__501.json | 100.0 | missing | missing | missing | |
| 8856 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_202257__801 | 1 | 0.00109234 | 6.06112 | 0 | [375, 438] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202257__801.json | 55.0 | missing | missing | missing | |
| 8857 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_195904__963 | 5 | 0.0 | 8.7232 | 4 | [375, 654] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195904__963.json | 100.0 | 0.9 | missing | 0.3 | |
| 8858 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_203606__259 | 1 | 0.000714691 | 3.38437 | 0 | [373, 244] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_203606__259.json | 55.0 | missing | missing | missing | |
| 8859 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194732__702 | 1 | 0.000739911 | 3.5124 | 0 | [373, 257] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_194732__702.json | 55.0 | missing | missing | missing | |
| 8860 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194735__660 | 1 | 0.000736031 | 3.59853 | 0 | [373, 255] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_194735__660.json | 55.0 | missing | missing | missing | |
| 8861 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202239__382 | 0 | 0.000699171 | 3.31703 | 0 | [373, 236] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_202239__382.json | 50.0 | missing | missing | missing | |
| 8862 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202242__593 | 1 | 0.000765131 | 3.76178 | 0 | [373, 270] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_202242__593.json | 55.0 | missing | missing | missing | |
| 8863 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-small--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_195856__637 | 5 | 0.0 | 5.65771 | 4 | [373, 424] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_195856__637.json | 100.0 | 0.9 | missing | 0.3 | |
| 8864 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_203540__499 | 0 | 0.000107401 | 2.33794 | 0 | [65, 217] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__AsIs__1SHOT__20231213_203540__499.json | 0.0 | missing | missing | missing | |
| 8865 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_194632__786 | 0 | 0.000173539 | 3.20403 | 0 | [65, 363] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__AsIs__1SHOT__20231225_194632__786.json | 0.0 | missing | missing | missing | |
| 8866 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_194635__398 | 0 | 0.00012235 | 2.20553 | 0 | [65, 250] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__AsIs__1SHOT__20231225_194635__398.json | 0.0 | missing | missing | missing | |
| 8867 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny--optim | AsIs | 1SHOT | false | false | 5 | 20231215_195829__149 | 0 | 0.0 | 1.99915 | 0 | [65, 234] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__AsIs__1SHOT__20231215_195829__149.json | 0.0 | 0.9 | missing | 0.3 | |
| 8868 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231213_203538__757 | 0 | 0.000225601 | 7.73251 | 0 | [68, 477] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__InJulia__1SHOT__20231213_203538__757.json | 50.0 | missing | missing | missing | |
| 8869 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_194625__791 | 3 | 0.000102385 | 1.84942 | 4 | [68, 205] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__InJulia__1SHOT__20231225_194625__791.json | 90.0 | missing | missing | missing | |
| 8870 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_194629__200 | 0 | 0.000178489 | 3.29583 | 4 | [68, 373] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__InJulia__1SHOT__20231225_194629__200.json | 75.0 | missing | missing | missing | |
| 8871 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_202159__849 | 0 | 0.000126394 | 2.64179 | 0 | [68, 258] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__InJulia__1SHOT__20231227_202159__849.json | 50.0 | missing | missing | missing | |
| 8872 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny--optim | InJulia | 1SHOT | true | true | 5 | 20231215_195827__120 | 0 | 0.0 | 1.88742 | 0 | [68, 222] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__InJulia__1SHOT__20231215_195827__120.json | 50.0 | 0.9 | missing | 0.3 | |
| 8873 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203530__187 | 0 | 3.6098e-5 | 0.917232 | 0 | [109, 46] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_203530__187.json | 50.0 | missing | missing | missing | |
| 8874 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194622__707 | 0 | 9.4535e-5 | 1.59756 | 0 | [109, 175] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_194622__707.json | 50.0 | missing | missing | missing | |
| 8875 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_194623__483 | 0 | 7.1432e-5 | 1.16702 | 0 | [109, 124] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_194623__483.json | 50.0 | missing | missing | missing | |
| 8876 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_202155__380 | 0 | 4.697e-5 | 0.807443 | 4 | [109, 70] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_202155__380.json | 75.0 | missing | missing | missing | |
| 8877 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_202156__478 | 0 | 3.6551e-5 | 0.604539 | 0 | [109, 47] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_202156__478.json | 50.0 | missing | missing | missing | |
| 8878 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_195825__210 | 0 | 0.0 | 0.583083 | 0 | [109, 46] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_195825__210.json | 50.0 | 0.9 | missing | 0.3 | |
| 8879 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_203529__490 | 0 | 0.000165119 | 6.99551 | 0 | [199, 303] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203529__490.json | 50.0 | missing | missing | missing | |
| 8880 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_194618__858 | 0 | 0.00016376 | 7.40639 | 0 | [199, 300] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194618__858.json | 0.0 | missing | missing | missing | |
| 8881 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_194621__509 | 0 | 0.000159683 | 2.65567 | 0 | [199, 291] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_194621__509.json | 0.0 | missing | missing | missing | |
| 8882 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_202152__218 | 1 | 0.000143375 | 8.38604 | 4 | [199, 255] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202152__218.json | 80.0 | missing | missing | missing | |
| 8883 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_202154__677 | 0 | 0.000152888 | 2.5069 | 0 | [199, 276] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202154__677.json | 50.0 | missing | missing | missing | |
| 8884 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_195824__842 | 0 | 0.0 | 5.96871 | 0 | [199, 268] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_195824__842.json | 50.0 | 0.9 | missing | 0.3 | |
| 8885 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_203551__802 | 0 | 0.000204708 | 4.68817 | 0 | [375, 336] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203551__802.json | 50.0 | missing | missing | missing |
| 8886 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_194644__939 | 0 | 0.000144459 | 2.05438 | 0 | [375, 203] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194644__939.json | 50.0 | missing | missing | missing | |
| 8887 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_194646__881 | 1 | 0.000164391 | 2.23269 | 0 | [375, 247] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_194646__881.json | 55.0 | missing | missing | missing | |
| 8888 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_202208__885 | 0 | 0.000207879 | 3.16996 | 0 | [375, 343] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202208__885.json | 50.0 | missing | missing | missing | |
| 8889 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_202212__343 | 0 | 0.000254085 | 3.98409 | 0 | [375, 445] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202212__343.json | 50.0 | missing | missing | missing | |
| 8890 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_195836__882 | 0 | 0.0 | 2.54544 | 0 | [375, 303] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_195836__882.json | 50.0 | 0.9 | missing | 0.3 | |
| 8891 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_203546__256 | 0 | 0.000190838 | 5.62853 | 0 | [373, 306] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_203546__256.json | 50.0 | missing | missing | missing |
| 8892 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194638__267 | 0 | 0.0002153 | 3.32223 | 0 | [373, 360] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_194638__267.json | 50.0 | missing | missing | missing | |
| 8893 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_194641__652 | 0 | 0.000223907 | 3.48982 | 0 | [373, 379] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_194641__652.json | 50.0 | missing | missing | missing | |
| 8894 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202202__211 | 0 | 0.000195368 | 2.83158 | 0 | [373, 316] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_202202__211.json | 50.0 | missing | missing | missing | |
| 8895 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202205__504 | 0 | 0.000208505 | 3.08121 | 0 | [373, 345] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_202205__504.json | 50.0 | missing | missing | missing | |
| 8896 | Apple-MacBook-Pro-M1 | keep_only_names | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_195833__915 | 0 | 0.0 | 3.72472 | 0 | [373, 438] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_195833__915.json | 50.0 | 0.9 | missing | 0.3 | |
| 8897 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_121859__818 | 0 | 0.0 | 8.35379 | 0 | [64, 211] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_121859__818.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8898 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_121905__917 | 0 | 0.0 | 6.06067 | 0 | [64, 151] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_121905__917.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8899 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_121844__836 | 0 | 0.0 | 5.88966 | 0 | [67, 143] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_121844__836.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8900 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_121851__254 | 3 | 0.0 | 6.24507 | 4 | [67, 152] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_121851__254.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8901 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_013322__878 | 3 | 0.0 | 5.85527 | 4 | [67, 141] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_013322__878.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8902 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_121836__459 | 1 | 0.0 | 2.03387 | 0 | [108, 36] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_121836__459.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8903 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_121838__244 | 0 | 0.0 | 2.00773 | 0 | [108, 35] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_121838__244.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8904 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013316__556 | 3 | 0.0 | 2.38388 | 4 | [108, 45] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_013316__556.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8905 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_121827__746 | 1 | 0.0 | 15.7963 | 0 | [198, 236] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_121827__746.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8906 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_121834__226 | 0 | 0.0 | 7.05053 | 0 | [198, 154] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_121834__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8907 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_013313__283 | 3 | 0.0 | 17.9174 | 4 | [198, 294] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013313__283.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8908 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_121940__506 | 1 | 0.0 | 16.0412 | 0 | [375, 356] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_121940__506.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8909 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_121954__681 | 0 | 0.0 | 14.0419 | 0 | [375, 306] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_121954__681.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8910 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013346__976 | 0 | 0.0 | 13.3496 | 0 | [375, 287] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013346__976.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8911 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_121916__980 | 1 | 0.0 | 10.7637 | 0 | [373, 224] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_121916__980.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8912 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_121923__605 | 1 | 0.0 | 7.27939 | 0 | [373, 136] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_121923__605.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8913 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013333__653 | 0 | 0.0 | 10.421 | 0 | [373, 214] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_013333__653.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8914 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_000117__749 | 0 | 0.0 | 6.72073 | 4 | [66, 209] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000117__749.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8915 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_000125__870 | 0 | 0.0 | 7.91929 | 0 | [66, 248] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000125__870.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8916 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231228_000138__578 | 0 | 0.0 | 12.6277 | 0 | [66, 401] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000138__578.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8917 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231228_000147__265 | 0 | 0.0 | 8.94519 | 0 | [66, 282] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000147__265.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8918 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231228_000156__782 | 0 | 0.0 | 9.09851 | 0 | [66, 287] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000156__782.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8919 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000100__936 | 1 | 0.0 | 1.83867 | 0 | [107, 41] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000100__936.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8920 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000105__216 | 1 | 0.0 | 4.1842 | 0 | [107, 120] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000105__216.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8921 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_000107__494 | 0 | 0.0 | 1.7923 | 0 | [107, 40] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000107__494.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8922 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_000108__361 | 0 | 0.0 | 1.8199 | 0 | [107, 41] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000108__361.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8923 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000110__106 | 0 | 0.0 | 1.98161 | 0 | [107, 46] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000110__106.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8924 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_000018__111 | 0 | 0.0 | 12.7239 | 0 | [197, 355] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000018__111.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8925 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_000029__689 | 0 | 0.0 | 10.6899 | 0 | [197, 315] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000029__689.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8926 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000040__695 | 0 | 0.0 | 11.1002 | 4 | [197, 324] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000040__695.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8927 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000050__246 | 0 | 0.0 | 10.016 | 0 | [197, 293] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000050__246.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8928 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_000058__415 | 0 | 0.0 | 8.46703 | 0 | [197, 243] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000058__415.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8929 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000305__613 | 2 | 0.0 | 8.05353 | 4 | [374, 202] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000305__613.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8930 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000317__985 | 0 | 0.0 | 11.4826 | 0 | [374, 309] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000317__985.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8931 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000327__417 | 1 | 0.0 | 10.4418 | 0 | [374, 277] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000327__417.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8932 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231228_000337__849 | 0 | 0.0 | 9.6749 | 0 | [374, 253] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000337__849.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8933 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000351__910 | 0 | 0.0 | 14.1706 | 0 | [374, 392] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000351__910.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8934 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000206__875 | 0 | 0.0 | 9.84195 | 0 | [372, 258] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000206__875.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8935 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000221__870 | 4 | 0.0 | 15.3997 | 4 | [372, 429] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000221__870.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8936 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000236__783 | 0 | 0.0 | 14.0683 | 0 | [372, 389] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000236__783.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8937 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231228_000246__335 | 0 | 0.0 | 10.6677 | 0 | [372, 284] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000246__335.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8938 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000257__560 | 0 | 0.0 | 10.5736 | 0 | [372, 280] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000257__560.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8939 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_000536__999 | 0 | 0.0 | 8.1674 | 0 | [66, 200] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000536__999.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8940 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231228_000544__683 | 0 | 0.0 | 7.37635 | 0 | [66, 180] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000544__683.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8941 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_000553__108 | 0 | 0.0 | 9.61419 | 0 | [66, 238] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000553__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8942 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_000603__759 | 0 | 0.0 | 9.35721 | 0 | [66, 231] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000603__759.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8943 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_000611__703 | 0 | 0.0 | 8.12283 | 0 | [66, 199] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000611__703.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8944 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000514__108 | 0 | 0.0 | 3.09153 | 0 | [107, 63] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000514__108.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8945 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000519__441 | 1 | 0.0 | 4.95916 | 0 | [107, 112] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000519__441.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 8946 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000522__628 | 0 | 0.0 | 2.98089 | 0 | [107, 60] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000522__628.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8947 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000525__174 | 0 | 0.0 | 2.93668 | 0 | [107, 59] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000525__174.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8948 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000528__325 | 0 | 0.0 | 3.01248 | 0 | [107, 61] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000528__325.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8949 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000404__562 | 0 | 0.0 | 12.8939 | 0 | [197, 280] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000404__562.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8950 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000422__471 | 1 | 0.0 | 18.1339 | 0 | [197, 432] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000422__471.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 8951 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_000439__627 | 0 | 0.0 | 16.4079 | 0 | [197, 389] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000439__627.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8952 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_000452__109 | 0 | 0.0 | 12.8866 | 0 | [197, 301] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000452__109.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8953 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000510__185 | 3 | 0.0 | 18.6899 | 4 | [197, 446] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000510__185.json | 90.0 | missing | {"num_gpu": 99} | missing | |
| 8954 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000729__368 | 0 | 0.0 | 11.2689 | 0 | [374, 234] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000729__368.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8955 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000759__730 | 1 | 0.0 | 29.9657 | 0 | [374, 687] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000759__730.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 8956 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000815__481 | 0 | 0.0 | 15.8855 | 0 | [374, 348] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000815__481.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8957 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231228_000827__779 | 0 | 0.0 | 11.8608 | 0 | [374, 249] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000827__779.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8958 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000841__816 | 0 | 0.0 | 13.7785 | 0 | [374, 296] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000841__816.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8959 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000625__822 | 0 | 0.0 | 13.5743 | 0 | [372, 291] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000625__822.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8960 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000640__288 | 0 | 0.0 | 15.1864 | 0 | [372, 331] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000640__288.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8961 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000654__863 | 0 | 0.0 | 14.2143 | 0 | [372, 307] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000654__863.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8962 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000704__981 | 0 | 0.0 | 9.8181 | 0 | [372, 198] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000704__981.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8963 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000718__526 | 1 | 0.0 | 13.5964 | 0 | [372, 292] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000718__526.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 8964 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_122814__540 | 0 | 0.0 | 16.4738 | 0 | [63, 305] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_122814__540.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8965 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_122826__785 | 0 | 0.0 | 11.9878 | 0 | [63, 221] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_122826__785.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8966 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_122745__870 | 0 | 0.0 | 13.6263 | 0 | [66, 249] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122745__870.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8967 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_122758__363 | 0 | 0.0 | 13.1537 | 0 | [66, 240] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122758__363.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8968 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_013550__301 | 0 | 0.0 | 13.0602 | 0 | [66, 238] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_013550__301.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8969 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_122728__714 | 0 | 0.0 | 3.1303 | 0 | [107, 46] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122728__714.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8970 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_122731__658 | 0 | 0.0 | 3.01569 | 0 | [107, 44] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122731__658.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8971 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013537__276 | 0 | 0.0 | 7.86999 | 0 | [107, 136] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_013537__276.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8972 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122701__732 | 1 | 0.0 | 11.6019 | 4 | [197, 196] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122701__732.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 8973 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122724__194 | 1 | 0.0 | 23.5279 | 4 | [197, 417] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122724__194.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 8974 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_013529__736 | 3 | 0.0 | 31.2166 | 4 | [197, 397] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013529__736.json | 90.0 | missing | {"num_gpu": 99} | missing | |
| 8975 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_122922__426 | 0 | 0.0 | 19.9279 | 0 | [374, 330] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122922__426.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8976 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_122941__992 | 0 | 0.0 | 18.6726 | 0 | [374, 307] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122941__992.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8977 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013632__726 | 1 | 0.0 | 19.7644 | 4 | [374, 326] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013632__726.json | 80.0 | missing | {"num_gpu": 99} | missing | |
| 8978 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_122845__715 | 0 | 0.0 | 18.8985 | 0 | [372, 311] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122845__715.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8979 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_122902__306 | 0 | 0.0 | 16.9156 | 0 | [372, 275] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122902__306.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8980 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013612__584 | 0 | 0.0 | 22.1835 | 0 | [372, 370] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_013612__584.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8981 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_122745__857 | 1 | 0.0 | 58.8855 | 0 | [71, 344] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_122745__857.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 8982 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_122826__427 | 5 | 0.0 | 40.5483 | 4 | [71, 237] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_122826__427.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 8983 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_122849__318 | 0 | 0.0 | 22.5282 | 0 | [71, 126] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_122849__318.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8984 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_153501__743 | 5 | 0.0 | 50.2937 | 4 | [71, 295] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_153501__743.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 8985 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_153539__359 | 1 | 0.0 | 37.5114 | 0 | [71, 205] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_153539__359.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 8986 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_122614__446 | 0 | 0.0 | 12.1889 | 0 | [110, 53] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_122614__446.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8987 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_122630__103 | 0 | 0.0 | 16.778 | 0 | [110, 80] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_122630__103.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8988 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_122646__380 | 0 | 0.0 | 16.1973 | 0 | [110, 74] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_122646__380.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8989 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_153343__111 | 0 | 0.0 | 9.55022 | 0 | [110, 40] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_153343__111.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8990 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_153410__845 | 5 | 0.0 | 27.217 | 4 | [110, 149] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_153410__845.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 8991 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_122353__268 | 0 | 0.0 | 64.7492 | 0 | [200, 305] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_122353__268.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8992 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_122441__809 | 0 | 0.0 | 48.1443 | 0 | [200, 244] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_122441__809.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8993 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_122601__397 | 5 | 0.0 | 79.7252 | 4 | [200, 419] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_122601__397.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 8994 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_153234__750 | 1 | 0.0 | 55.6298 | 0 | [200, 305] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_153234__750.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 8995 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_153333__893 | 0 | 0.0 | 59.5287 | 0 | [200, 328] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_153333__893.json | 50.0 | missing | {"num_gpu": 99} | missing | |
| 8996 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_123146__804 | 0 | 0.0 | 34.6543 | 0 | [384, 151] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123146__804.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8997 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_123156__109 | 0 | 0.0 | 10.2473 | 0 | [384, 4] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123156__109.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 8998 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_123326__218 | 5 | 0.0 | 90.109 | 4 | [384, 477] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123326__218.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 8999 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_153751__297 | 0 | 0.0 | 10.433 | 0 | [384, 4] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_153751__297.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 9000 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_153824__971 | 0 | 0.0 | 33.2001 | 0 | [384, 141] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_153824__971.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 9001 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_122951__976 | 3 | 0.0 | 61.7902 | 4 | [382, 312] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_122951__976.json | 90.0 | missing | {"num_gpu": 99} | missing | |
| 9002 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_123037__575 | 1 | 0.0 | 45.7293 | 0 | [382, 217] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_123037__575.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 9003 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_123111__662 | 5 | 0.0 | 34.2941 | 4 | [382, 149] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_123111__662.json | 100.0 | missing | {"num_gpu": 99} | missing | |
| 9004 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_153612__393 | 1 | 0.0 | 32.7857 | 0 | [382, 138] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_153612__393.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 9005 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_153740__682 | 1 | 0.0 | 88.0905 | 0 | [382, 457] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_153740__682.json | 55.0 | missing | {"num_gpu": 99} | missing | |
| 9006 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_122053__941 | 0 | 0.0 | 8.0277 | 0 | [72, 198] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_122053__941.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 9007 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_122058__202 | 0 | 0.0 | 4.7415 | 0 | [72, 112] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_122058__202.json | 0.0 | missing | {"num_gpu": 99} | missing | |
| 9008 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_122038__541 | 3 | 0.0 | 6.53778 | 4 | [75, 159] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_122038__541.json | 90.0 | missing | {"num_gpu": 99} | missing | |
| 9009 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_122045__593 | 0 | 0.0 | 7.10382 | 4 | [75, 174] | 0.10.0-DEV | 4 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_122045__593.json | 75.0 | missing | {"num_gpu": 99} | missing | |
| 9010 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_013414__238 | 0 | 0.0 | 6.64198 | 0 | [75, 161] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_013414__238.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9011 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_122029__210 | 0 | 0.0 | 4.77329 | 0 | [116, 108] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_122029__210.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9012 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_122031__694 | 1 | 0.0 | 2.35263 | 0 | [116, 44] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_122031__694.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9013 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013407__599 | 3 | 0.0 | 3.68504 | 4 | [116, 79] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_013407__599.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9014 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_122015__574 | 0 | 0.0 | 20.8095 | 0 | [206, 343] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122015__574.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9015 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_122024__654 | 1 | 0.0 | 9.13778 | 0 | [206, 207] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122024__654.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9016 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_013403__535 | 0 | 0.0 | 17.3026 | 0 | [206, 263] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013403__535.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9017 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_122134__140 | 1 | 0.0 | 11.6293 | 0 | [383, 245] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122134__140.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9018 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_122147__304 | 1 | 0.0 | 11.9915 | 0 | [383, 253] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122147__304.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9019 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013443__369 | 1 | 0.0 | 14.1849 | 0 | [383, 307] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013443__369.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9020 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_122111__956 | 5 | 0.0 | 13.0121 | 4 | [381, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_122111__956.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9021 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_122123__292 | 0 | 0.0 | 11.8215 | 0 | [381, 250] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_122123__292.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9022 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013429__398 | 1 | 0.0 | 14.7892 | 0 | [381, 322] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_013429__398.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9023 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231214_075333__271 | 0 | 0.0 | 9.72409 | 0 | [52, 296] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_075333__271.json | 0.0 | missing | missing | missing | |
| 9024 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_114232__685 | 0 | 0.0 | 4.93846 | 0 | [70, 154] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_114232__685.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9025 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_114238__558 | 0 | 0.0 | 6.05294 | 0 | [70, 191] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_114238__558.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9026 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231214_075323__883 | 0 | 0.0 | 12.0569 | 0 | [69, 359] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_075323__883.json | 0.0 | missing | missing | missing | |
| 9027 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_114216__269 | 5 | 0.0 | 5.59987 | 4 | [73, 176] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_114216__269.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9028 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_114227__706 | 1 | 0.0 | 11.4762 | 0 | [73, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_114227__706.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9029 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_011645__523 | 5 | 0.0 | 8.73798 | 4 | [73, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_011645__523.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9030 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075311__717 | 1 | 0.0 | 8.9848 | 0 | [98, 258] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_075311__717.json | 55.0 | missing | missing | missing | |
| 9031 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_114205__239 | 0 | 0.0 | 2.5545 | 0 | [114, 67] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_114205__239.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9032 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114210__657 | 5 | 0.0 | 5.19515 | 4 | [114, 157] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_114210__657.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9033 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011636__194 | 5 | 0.0 | 4.22062 | 4 | [114, 123] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_011636__194.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9034 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_075302__125 | 0 | 0.0 | 12.4073 | 0 | [188, 331] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075302__125.json | 25.0 | missing | missing | missing | |
| 9035 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114155__854 | 4 | 0.0 | 15.0408 | 4 | [204, 288] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114155__854.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9036 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114202__917 | 3 | 0.0 | 7.19631 | 4 | [204, 208] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114202__917.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9037 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_011631__176 | 5 | 0.0 | 14.6013 | 4 | [204, 286] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011631__176.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9038 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_075409__680 | 0 | 0.0 | 14.6656 | 0 | [11, 405] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075409__680.json | 0.0 | missing | missing | missing | |
| 9039 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_114303__523 | 0 | 0.0 | 2.69387 | 0 | [381, 33] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114303__523.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9040 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_114311__595 | 5 | 0.0 | 8.52521 | 4 | [381, 222] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114311__595.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9041 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_011656__496 | 0 | 0.0 | 2.55718 | 0 | [381, 28] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011656__496.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9042 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_075354__652 | 0 | 0.0 | 21.1672 | 0 | [369, 493] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_075354__652.json | 50.0 | missing | missing | missing | |
| 9043 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114246__271 | 5 | 0.0 | 7.84882 | 4 | [379, 201] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_114246__271.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9044 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114300__744 | 5 | 0.0 | 13.5753 | 4 | [379, 383] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_114300__744.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9045 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_011653__518 | 5 | 0.0 | 8.46281 | 4 | [379, 219] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_011653__518.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9046 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231214_080230__660 | 0 | 0.0 | 10.3956 | 0 | [52, 315] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__AsIs__1SHOT__20231214_080230__660.json | 0.0 | missing | missing | missing | |
| 9047 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_115927__766 | 0 | 0.0 | 13.5471 | 0 | [69, 245] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__AsIs__1SHOT__20231225_115927__766.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9048 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_115938__879 | 0 | 0.0 | 10.7432 | 0 | [69, 192] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__AsIs__1SHOT__20231225_115938__879.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9049 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231214_080220__844 | 0 | 0.0 | 10.6957 | 0 | [69, 319] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__InJulia__1SHOT__20231214_080220__844.json | 0.0 | missing | missing | missing | |
| 9050 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_115903__346 | 3 | 0.0 | 10.3201 | 4 | [72, 184] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__InJulia__1SHOT__20231225_115903__346.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9051 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_115913__589 | 3 | 0.0 | 9.80038 | 4 | [72, 174] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__InJulia__1SHOT__20231225_115913__589.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9052 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231227_012452__176 | 0 | 0.0 | 10.4962 | 0 | [72, 186] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__InJulia__1SHOT__20231227_012452__176.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9053 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_080209__449 | 1 | 0.0 | 6.83198 | 0 | [98, 193] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_080209__449.json | 55.0 | missing | missing | missing | |
| 9054 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115843__536 | 0 | 0.0 | 4.54659 | 0 | [111, 68] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_115843__536.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9055 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115853__191 | 0 | 0.0 | 9.97366 | 0 | [111, 172] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_115853__191.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9056 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_012442__440 | 3 | 0.0 | 9.19185 | 4 | [111, 156] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_012442__440.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9057 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_080202__315 | 5 | 0.0 | 12.7731 | 4 | [188, 341] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080202__315.json | 100.0 | missing | missing | missing | |
| 9058 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_115820__136 | 3 | 0.0 | 29.4197 | 4 | [201, 341] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115820__136.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9059 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_115838__456 | 0 | 0.0 | 17.9023 | 0 | [201, 303] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115838__456.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9060 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_012432__958 | 0 | 0.0 | 41.7004 | 0 | [201, 566] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_012432__958.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9061 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_080315__272 | 0 | 0.0 | 20.2466 | 0 | [11, 549] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080315__272.json | 25.0 | missing | missing | missing | |
| 9062 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_120050__959 | 0 | 0.0 | 39.4556 | 0 | [375, 645] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_120050__959.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9063 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_120114__453 | 0 | 0.0 | 23.4365 | 0 | [375, 370] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_120114__453.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
*(table truncated — thousands of additional rows, one per definition/model/prompt evaluation run, with the same columns: parsed/executed status, unit tests passed, elapsed seconds, tokens, schema, score, etc.)*
| 9136 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_012334__872 | 0 | 0.0 | 4.82818 | 0 | [112, 267] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_012334__872.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9137 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_080038__288 | 1 | 0.0 | 18.1234 | 0 | [188, 489] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080038__288.json | 55.0 | missing | missing | missing | |
| 9138 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_115704__317 | 0 | 0.0 | 6.11064 | 0 | [197, 173] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115704__317.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9139 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_115707__681 | 0 | 0.0 | 3.40053 | 0 | [197, 168] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115707__681.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9140 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_012329__167 | 0 | 0.0 | 11.1561 | 0 | [197, 439] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_012329__167.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9141 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_080149__303 | 0 | 0.0 | 22.2493 | 0 | [11, 599] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080149__303.json | 25.0 | missing | missing | missing | |
| 9142 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_115746__487 | 0 | 0.0 | 5.7509 | 0 | [362, 254] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115746__487.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9143 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_115751__802 | 0 | 0.0 | 5.30094 | 0 | [362, 233] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115751__802.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9144 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_012351__545 | 0 | 0.0 | 5.77068 | 0 | [362, 257] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_012351__545.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9145 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_080127__188 | 0 | 0.0 | 23.62 | 0 | [369, 555] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_080127__188.json | 50.0 | missing | missing | missing | |
| 9146 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_115735__923 | 0 | 0.0 | 4.59024 | 0 | [360, 188] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_115735__923.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9147 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_115740__284 | 0 | 0.0 | 4.94026 | 0 | [360, 213] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_115740__284.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9148 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_012345__392 | 0 | 0.0 | 5.32464 | 0 | [360, 234] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_012345__392.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9149 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231214_075446__857 | 0 | 0.0 | 9.79375 | 0 | [52, 298] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__AsIs__1SHOT__20231214_075446__857.json | 0.0 | missing | missing | missing | |
| 9150 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_114424__413 | 0 | 0.0 | 7.53145 | 0 | [72, 240] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_114424__413.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9151 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_114429__318 | 0 | 0.0 | 4.37549 | 0 | [72, 134] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_114429__318.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9152 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | InJulia | 1SHOT | false | false | 5 | 20231214_075436__541 | 0 | 0.0 | 9.90712 | 0 | [69, 295] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_075436__541.json | 0.0 | missing | missing | missing | |
| 9153 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_114412__643 | 2 | 0.0 | 8.55479 | 4 | [75, 274] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_114412__643.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9154 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_114417__312 | 1 | 0.0 | 4.7001 | 0 | [75, 144] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_114417__312.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9155 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_011727__392 | 1 | 0.0 | 6.60204 | 0 | [75, 208] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_011727__392.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9156 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075426__293 | 2 | 0.0 | 7.26552 | 4 | [98, 205] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_075426__293.json | 85.0 | missing | missing | missing | |
| 9157 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114355__290 | 1 | 0.0 | 8.65511 | 0 | [116, 272] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_114355__290.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9158 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114403__886 | 2 | 0.0 | 7.69726 | 4 | [116, 239] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_114403__886.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9159 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011720__313 | 1 | 0.0 | 6.00522 | 0 | [116, 182] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_011720__313.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9160 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_075419__423 | 1 | 0.0 | 10.0351 | 0 | [188, 263] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075419__423.json | 55.0 | missing | missing | missing | |
| 9161 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114332__505 | 5 | 0.0 | 20.7529 | 4 | [206, 472] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114332__505.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9162 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114346__137 | 5 | 0.0 | 14.3472 | 4 | [206, 439] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114346__137.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9163 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_011714__123 | 5 | 0.0 | 18.1528 | 4 | [206, 394] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011714__123.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9164 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_075516__789 | 0 | 0.0 | 10.1656 | 0 | [11, 283] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075516__789.json | 0.0 | missing | missing | missing | |
| 9165 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_114456__603 | 2 | 0.0 | 9.28117 | 4 | [383, 246] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114456__603.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9166 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_114506__915 | 1 | 0.0 | 9.7461 | 0 | [383, 261] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114506__915.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9167 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_011755__640 | 1 | 0.0 | 9.16901 | 0 | [383, 241] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011755__640.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9168 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_075506__423 | 1 | 0.0 | 19.4093 | 0 | [369, 447] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_075506__423.json | 55.0 | missing | missing | missing | |
| 9169 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114439__588 | 1 | 0.0 | 10.4916 | 0 | [381, 284] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_114439__588.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9170 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114447__151 | 5 | 0.0 | 7.1925 | 4 | [381, 179] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_114447__151.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9171 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_011746__925 | 0 | 0.0 | 18.9519 | 0 | [381, 544] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_011746__925.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9172 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231214_075546__815 | 0 | 0.0 | 6.889 | 0 | [52, 208] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__AsIs__1SHOT__20231214_075546__815.json | 0.0 | missing | missing | missing | |
| 9173 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231225_114911__976 | 0 | 0.0 | 23.1271 | 0 | [68, 169] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_114911__976.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9174 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231225_114943__142 | 0 | 0.0 | 31.6439 | 0 | [68, 236] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_114943__142.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9175 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | InJulia | 1SHOT | false | false | 5 | 20231214_075539__868 | 0 | 0.0 | 9.75106 | 0 | [69, 291] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_075539__868.json | 0.0 | missing | missing | missing | |
| 9176 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_114804__476 | 1 | 0.0 | 70.6903 | 0 | [71, 536] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_114804__476.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9177 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_114848__700 | 0 | 0.0 | 43.5071 | 0 | [71, 328] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_114848__700.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9178 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231227_011947__677 | 1 | 0.0 | 58.5035 | 0 | [71, 441] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_011947__677.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9179 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075529__292 | 1 | 0.0 | 7.70736 | 0 | [98, 220] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_075529__292.json | 55.0 | missing | missing | missing | |
| 9180 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114629__365 | 5 | 0.0 | 7.10941 | 4 | [110, 36] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_114629__365.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9181 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114654__221 | 1 | 0.0 | 23.8852 | 0 | [110, 169] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_114654__221.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9182 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011848__599 | 5 | 0.0 | 14.3099 | 4 | [110, 93] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_011848__599.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9183 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_075522__503 | 0 | 0.0 | 5.57692 | 0 | [188, 130] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075522__503.json | 0.0 | missing | missing | missing | |
| 9184 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114552__743 | 1 | 0.0 | 45.6022 | 4 | [200, 139] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114552__743.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9185 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114622__102 | 5 | 0.0 | 30.5297 | 4 | [200, 204] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114622__102.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9186 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_011834__857 | 0 | 0.0 | 38.6292 | 0 | [200, 92] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011834__857.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9187 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_075622__350 | 0 | 0.0 | 16.2484 | 0 | [11, 445] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075622__350.json | 50.0 | missing | missing | missing | |
| 9188 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_115110__626 | 1 | 0.0 | 15.4574 | 0 | [384, 59] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115110__626.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9189 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_115209__114 | 5 | 0.0 | 59.1356 | 4 | [384, 388] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115209__114.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9190 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_012139__677 | 0 | 0.0 | 63.0962 | 0 | [384, 416] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_012139__677.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9191 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_075606__266 | 1 | 0.0 | 19.3083 | 0 | [369, 444] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_075606__266.json | 55.0 | missing | missing | missing | |
| 9192 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115020__152 | 5 | 0.0 | 36.9208 | 4 | [382, 222] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_115020__152.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9193 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115055__164 | 0 | 0.0 | 34.2584 | 0 | [382, 202] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_115055__164.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9194 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_012035__154 | 0 | 0.0 | 48.2187 | 0 | [382, 306] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_012035__154.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9195 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231214_081315__420 | 0 | 0.0 | 20.7942 | 0 | [53, 611] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231214_081315__420.json | 0.0 | missing | missing | missing | |
| 9196 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_125113__341 | 0 | 0.0 | 26.2086 | 0 | [75, 481] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_125113__341.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9197 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_125126__261 | 0 | 0.0 | 12.4319 | 0 | [75, 224] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_125126__261.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9198 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231214_081254__785 | 0 | 0.0 | 16.7285 | 0 | [70, 493] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_081254__785.json | 25.0 | missing | missing | missing | |
| 9199 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231225_125019__725 | 0 | 0.0 | 25.712 | 0 | [78, 473] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_125019__725.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9200 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231225_125047__539 | 0 | 0.0 | 28.2169 | 0 | [78, 518] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_125047__539.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9201 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231227_015034__563 | 0 | 0.0 | 24.8472 | 0 | [78, 450] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_015034__563.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9202 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_081237__359 | 0 | 0.0 | 18.9857 | 0 | [99, 544] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_081237__359.json | 50.0 | missing | missing | missing | |
| 9203 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_124935__414 | 0 | 0.0 | 18.7463 | 0 | [116, 338] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_124935__414.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9204 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_124953__383 | 0 | 0.0 | 17.3474 | 0 | [116, 312] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_124953__383.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9205 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_014809__338 | 0 | 0.0 | 12.9204 | 1 | [116, 227] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_014809__338.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9206 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_081218__584 | 0 | 0.0 | 20.6979 | 0 | [187, 558] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081218__584.json | 25.0 | missing | missing | missing | |
| 9207 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_124855__288 | 0 | 0.0 | 42.0522 | 0 | [205, 564] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_124855__288.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9208 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_124917__503 | 0 | 0.0 | 22.041 | 1 | [205, 380] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_124917__503.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9209 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_014756__213 | 0 | 0.0 | 42.1065 | 0 | [205, 568] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_014756__213.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9210 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_081406__119 | 0 | 0.0 | 21.1032 | 0 | [11, 570] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081406__119.json | 50.0 | missing | missing | missing | |
| 9211 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_125232__993 | 0 | 0.0 | 20.8633 | 3 | [381, 326] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_125232__993.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 9212 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_125307__928 | 0 | 0.0 | 33.9336 | 0 | [381, 546] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_125307__928.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9213 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_015125__150 | 0 | 0.0 | 23.9987 | 0 | [381, 379] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_015125__150.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9214 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_081344__329 | 0 | 0.0 | 29.2769 | 0 | [370, 694] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_081344__329.json | 25.0 | missing | missing | missing | |
| 9215 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_125156__595 | 0 | 0.0 | 30.4516 | 0 | [378, 494] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_125156__595.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9216 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_125211__963 | 0 | 0.0 | 14.819 | 0 | [378, 217] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_125211__963.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9217 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_015101__839 | 0 | 0.0 | 26.2985 | 0 | [378, 419] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_015101__839.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9218 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_031134__756 | 0 | 0.0 | 4.24615 | 0 | [0, 315] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_031134__756.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9219 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_031141__760 | 0 | 0.0 | 7.00728 | 0 | [0, 516] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_031141__760.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9220 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_031146__819 | 1 | 0.0 | 4.93574 | 4 | [0, 365] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_031146__819.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9221 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_031158__551 | 1 | 0.0 | 12.176 | 2 | [0, 881] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_031158__551.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9222 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_031206__386 | 3 | 0.0 | 7.23861 | 4 | [0, 533] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_031206__386.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9223 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_030143__271 | 1 | 0.0 | 4.2253 | 2 | [0, 313] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_030143__271.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9224 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_030146__967 | 0 | 0.0 | 2.69032 | 0 | [0, 195] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_030146__967.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9225 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_030149__964 | 1 | 0.0 | 2.60932 | 1 | [0, 193] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_030149__964.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9226 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_030152__743 | 0 | 0.0 | 2.50553 | 0 | [0, 185] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_030152__743.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9227 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_030154__652 | 0 | 0.0 | 1.86962 | 1 | [0, 142] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_030154__652.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9228 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_030057__425 | 1 | 0.0 | 9.07063 | 2 | [0, 652] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_030057__425.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9229 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_030102__443 | 0 | 0.0 | 4.9925 | 0 | [0, 371] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_030102__443.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9230 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_030109__212 | 0 | 0.0 | 7.30146 | 0 | [0, 535] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_030109__212.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9231 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_030115__767 | 0 | 0.0 | 5.89087 | 0 | [0, 439] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_030115__767.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9232 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_030119__759 | 1 | 0.0 | 3.99171 | 1 | [0, 295] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_030119__759.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9233 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_031924__733 | 0 | 0.0 | 3.51629 | 0 | [0, 257] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_031924__733.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9234 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_031931__851 | 0 | 0.0 | 6.93434 | 0 | [0, 505] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_031931__851.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9235 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_031936__361 | 0 | 0.0 | 5.41943 | 0 | [0, 395] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_031936__361.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9236 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_031941__249 | 0 | 0.0 | 4.68775 | 0 | [0, 342] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_031941__249.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9237 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_031947__512 | 3 | 0.0 | 5.81954 | 4 | [0, 426] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_031947__512.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9238 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_031240__887 | 0 | 0.0 | 4.25891 | 0 | [0, 313] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_031240__887.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9239 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_031245__676 | 0 | 0.0 | 5.41984 | 1 | [0, 392] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_031245__676.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9240 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_031249__459 | 0 | 0.0 | 2.31048 | 0 | [0, 171] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_031249__459.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9241 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_031256__608 | 1 | 0.0 | 6.99707 | 1 | [0, 511] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_031256__608.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9242 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_031658__992 | 0 | 0.0 | 1.56618 | 1 | [0, 107] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_031658__992.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9243 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231214_081519__171 | 0 | 0.0 | 22.043 | 0 | [53, 645] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__AsIs__1SHOT__20231214_081519__171.json | 0.0 | missing | missing | missing | |
| 9244 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_125506__415 | 0 | 0.0 | 18.5099 | 0 | [49, 340] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_125506__415.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9245 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231225_125518__651 | 0 | 0.0 | 11.6668 | 0 | [49, 213] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_125518__651.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9246 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_081457__126 | 0 | 0.0 | 13.8492 | 0 | [70, 411] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_081457__126.json | 25.0 | missing | missing | missing | |
| 9247 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_125408__196 | 0 | 0.0 | 20.3985 | 0 | [52, 377] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_125408__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9248 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_125448__622 | 0 | 0.0 | 39.8091 | 0 | [52, 720] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_125448__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9249 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_081443__680 | 0 | 0.0 | 18.3441 | 0 | [99, 526] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_081443__680.json | 0.0 | missing | missing | missing | |
| 9250 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_125323__492 | 0 | 0.0 | 1.60976 | 0 | [53, 20] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_125323__492.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9251 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_125348__338 | 0 | 0.0 | 24.6256 | 0 | [53, 450] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_125348__338.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9252 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_081424__916 | 0 | 0.0 | 18.6604 | 0 | [187, 504] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081424__916.json | 0.0 | missing | missing | missing | |
| 9253 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_125319__140 | 0 | 0.0 | 12.3596 | 0 | [80, 28] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_125319__140.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9254 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_125321__916 | 0 | 0.0 | 2.00512 | 0 | [80, 23] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_125321__916.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9255 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_081606__122 | 0 | 0.0 | 23.9125 | 0 | [11, 640] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081606__122.json | 50.0 | missing | missing | missing | |
| 9256 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_125527__297 | 0 | 0.0 | 1.37585 | 0 | [70, 11] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_125527__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9257 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_125530__311 | 0 | 0.0 | 3.09635 | 0 | [70, 43] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_125530__311.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9258 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_081542__814 | 0 | 0.0 | 22.9211 | 0 | [370, 537] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_081542__814.json | 25.0 | missing | missing | missing | |
| 9259 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_125522__438 | 0 | 0.0 | 3.60002 | 0 | [67, 54] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_125522__438.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9260 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_125525__633 | 0 | 0.0 | 3.64621 | 0 | [67, 55] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_125525__633.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9261 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_034241__348 | 1 | 0.0 | 13.1858 | 1 | [0, 475] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_034241__348.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9262 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_034251__363 | 0 | 0.0 | 9.63403 | 0 | [0, 348] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_034251__363.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9263 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_034301__536 | 0 | 0.0 | 9.89115 | 0 | [0, 357] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_034301__536.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9264 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_034311__813 | 0 | 0.0 | 10.1844 | 0 | [0, 367] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_034311__813.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9265 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_034324__638 | 0 | 0.0 | 12.5433 | 0 | [0, 452] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_034324__638.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9266 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_033523__989 | 0 | 0.0 | 5.60183 | 0 | [0, 202] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_033523__989.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9267 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_033533__379 | 0 | 0.0 | 9.62462 | 0 | [0, 346] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_033533__379.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9268 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_034051__641 | 0 | 0.0 | 17.8094 | 0 | [116, 403] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_034051__641.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9269 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_034102__125 | 0 | 0.0 | 10.507 | 1 | [0, 377] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_034102__125.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9270 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_034112__122 | 0 | 0.0 | 10.1988 | 0 | [0, 367] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_034112__122.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9271 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_032848__555 | 0 | 0.0 | 10.2752 | 0 | [0, 367] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_032848__555.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9272 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_033403__954 | 0 | 0.0 | 15.1067 | 0 | [205, 310] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_033403__954.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9273 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_033412__230 | 0 | 0.0 | 8.68011 | 0 | [0, 310] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_033412__230.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9274 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_033423__939 | 0 | 0.0 | 11.3598 | 0 | [0, 402] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_033423__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9275 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_033440__160 | 0 | 0.0 | 16.5294 | 0 | [0, 586] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_033440__160.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9276 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_035407__856 | 1 | 0.0 | 30.0817 | 1 | [0, 1062] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_035407__856.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9277 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_035415__524 | 0 | 0.0 | 8.56909 | 0 | [0, 303] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_035415__524.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9278 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_035434__401 | 0 | 0.0 | 18.4086 | 0 | [0, 651] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_035434__401.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9279 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_035445__746 | 0 | 0.0 | 10.8037 | 0 | [0, 383] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_035445__746.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9280 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_035445__874 | 0 | 0.0 | 0.124484 | 0 | [0, 4] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_035445__874.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9281 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_034445__957 | 0 | 0.0 | 8.38281 | 1 | [0, 298] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_034445__957.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9282 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_034456__695 | 0 | 0.0 | 10.4444 | 0 | [0, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_034456__695.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9283 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_034507__895 | 1 | 0.0 | 10.7095 | 1 | [0, 381] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_034507__895.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9284 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_034520__132 | 1 | 0.0 | 13.4418 | 1 | [0, 477] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_034520__132.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9285 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_035248__498 | 0 | 0.0 | 27.0595 | 0 | [378, 218] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_035248__498.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9286 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_025343__772 | 0 | 0.0 | 22.3987 | 0 | [84, 539] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_025343__772.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9287 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_025401__163 | 0 | 0.0 | 17.4185 | 0 | [0, 427] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_025401__163.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9288 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_025425__858 | 1 | 0.0 | 24.0645 | 2 | [0, 587] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_025425__858.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9289 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_025444__549 | 1 | 0.0 | 18.2571 | 1 | [0, 448] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_025444__549.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9290 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_025503__547 | 0 | 0.0 | 19.6804 | 0 | [0, 482] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_025503__547.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9291 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_024943__868 | 0 | 0.0 | 14.2185 | 0 | [0, 348] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_024943__868.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9292 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_025007__103 | 1 | 0.0 | 23.7307 | 1 | [0, 577] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_025007__103.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9293 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_025019__815 | 0 | 0.0 | 11.2979 | 0 | [0, 276] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_025019__815.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9294 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_025040__164 | 1 | 0.0 | 20.8579 | 2 | [0, 508] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_025040__164.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9295 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_025046__112 | 0 | 0.0 | 6.6504 | 0 | [0, 163] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_025046__112.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9296 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_024528__970 | 0 | 0.0 | 23.6089 | 0 | [0, 572] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_024528__970.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9297 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_024739__211 | 0 | 0.0 | 10.4343 | 0 | [0, 253] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_024739__211.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9298 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_024749__550 | 0 | 0.0 | 9.91108 | 0 | [0, 243] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_024749__550.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9299 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_024755__795 | 0 | 0.0 | 6.18915 | 0 | [0, 152] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_024755__795.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9300 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_024808__547 | 1 | 0.0 | 13.3177 | 1 | [0, 324] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_024808__547.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9301 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_025927__834 | 0 | 0.0 | 7.64656 | 0 | [0, 185] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_025927__834.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9302 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_025948__502 | 0 | 0.0 | 21.2761 | 0 | [0, 511] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_025948__502.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9303 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_025952__198 | 0 | 0.0 | 3.17047 | 0 | [0, 77] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_025952__198.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9304 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_030011__230 | 0 | 0.0 | 18.9489 | 0 | [0, 456] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_030011__230.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9305 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_030019__849 | 0 | 0.0 | 7.78058 | 0 | [0, 186] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_030019__849.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9306 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_025708__636 | 0 | 0.0 | 16.0352 | 0 | [0, 386] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_025708__636.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9307 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_025719__598 | 0 | 0.0 | 10.9967 | 0 | [0, 265] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_025719__598.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9308 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_025727__450 | 0 | 0.0 | 8.43641 | 0 | [0, 204] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_025727__450.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9309 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_025751__764 | 0 | 0.0 | 23.8884 | 0 | [0, 574] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_025751__764.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9310 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_025809__459 | 0 | 0.0 | 17.3131 | 0 | [0, 417] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_025809__459.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9311 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_023406__786 | 0 | 0.0 | 27.8598 | 0 | [0, 516] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_023406__786.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9312 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_023440__650 | 0 | 0.0 | 34.3196 | 0 | [0, 636] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_023440__650.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9313 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_023502__742 | 1 | 0.0 | 21.7708 | 1 | [0, 405] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_023502__742.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9314 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_023534__868 | 0 | 0.0 | 31.5829 | 0 | [0, 586] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_023534__868.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9315 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_023559__150 | 1 | 0.0 | 24.4019 | 1 | [0, 455] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_023559__150.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9316 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_023015__815 | 1 | 0.0 | 23.1322 | 2 | [0, 431] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_023015__815.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9317 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_023042__455 | 1 | 0.0 | 26.3911 | 1 | [0, 491] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_023042__455.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9318 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_023112__352 | 0 | 0.0 | 29.9494 | 0 | [0, 556] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_023112__352.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9319 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_023114__669 | 0 | 0.0 | 2.18939 | 0 | [0, 41] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_023114__669.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9320 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_023141__342 | 0 | 0.0 | 26.1833 | 0 | [0, 487] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_023141__342.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9321 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_022630__274 | 0 | 0.0 | 14.4585 | 0 | [0, 269] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022630__274.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9322 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_022645__734 | 2 | 0.0 | 15.7016 | 2 | [0, 292] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022645__734.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9323 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_022720__112 | 0 | 0.0 | 34.7706 | 0 | [0, 642] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022720__112.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9324 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_022748__685 | 1 | 0.0 | 27.5515 | 1 | [0, 511] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022748__685.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9325 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_022805__416 | 1 | 0.0 | 16.7513 | 1 | [0, 312] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_022805__416.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9326 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_024241__568 | 2 | 0.0 | 16.3196 | 2 | [0, 301] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_024241__568.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9327 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_024253__159 | 0 | 0.0 | 11.2827 | 0 | [0, 208] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_024253__159.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9328 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_024303__823 | 0 | 0.0 | 9.96854 | 0 | [0, 184] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_024303__823.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9329 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_024321__809 | 1 | 0.0 | 17.9931 | 1 | [0, 331] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_024321__809.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9330 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_024331__918 | 0 | 0.0 | 10.1736 | 0 | [0, 188] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_024331__918.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9331 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_023920__365 | 0 | 0.0 | 60.0652 | 0 | [0, 1100] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_023920__365.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9332 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_023941__271 | 2 | 0.0 | 21.0998 | 2 | [0, 388] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_023941__271.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9333 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_024000__994 | 1 | 0.0 | 19.0109 | 4 | [0, 350] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_024000__994.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9334 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_024009__690 | 0 | 0.0 | 8.96218 | 0 | [0, 165] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_024009__690.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9335 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_024032__193 | 0 | 0.0 | 22.7034 | 0 | [0, 417] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_024032__193.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9336 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_032608__183 | 0 | 0.0 | 1.96278 | 0 | [0, 231] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_032608__183.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9337 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20240201_032610__902 | 0 | 0.0 | 1.33066 | 0 | [0, 155] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_032610__902.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9338 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_032611__364 | 1 | 0.0 | 1.55657 | 1 | [0, 181] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_032611__364.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9339 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_032613__740 | 0 | 0.0 | 1.33517 | 0 | [0, 155] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_032613__740.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9340 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_032614__186 | 0 | 0.0 | 1.11268 | 4 | [0, 132] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_032614__186.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9341 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_032034__483 | 0 | 0.0 | 2.44474 | 0 | [0, 292] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_032034__483.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9342 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_032039__423 | 0 | 0.0 | 5.44356 | 0 | [0, 647] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_032039__423.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9343 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_032041__483 | 0 | 0.0 | 0.900438 | 4 | [0, 109] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_032041__483.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9344 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_032042__229 | 0 | 0.0 | 1.43955 | 0 | [0, 174] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_032042__229.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9345 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_032047__178 | 0 | 0.0 | 4.80232 | 0 | [0, 577] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_032047__178.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9346 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_032008__617 | 0 | 0.0 | 2.97499 | 0 | [0, 346] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_032008__617.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9347 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_032011__420 | 1 | 0.0 | 3.02818 | 2 | [0, 350] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_032011__420.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9348 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_032013__727 | 0 | 0.0 | 1.95978 | 0 | [0, 228] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_032013__727.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9349 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_032014__465 | 0 | 0.0 | 1.15492 | 0 | [0, 134] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_032014__465.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9350 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_032015__938 | 0 | 0.0 | 0.721582 | 0 | [0, 85] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_032015__938.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9351 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_032722__127 | 0 | 0.0 | 5.32643 | 0 | [0, 612] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_032722__127.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9352 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_032726__389 | 1 | 0.0 | 3.47362 | 2 | [0, 405] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_032726__389.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9353 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_032732__322 | 1 | 0.0 | 5.57511 | 4 | [0, 644] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_032732__322.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9354 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_032736__225 | 1 | 0.0 | 4.27544 | 1 | [0, 497] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_032736__225.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9355 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_032738__722 | 0 | 0.0 | 1.26281 | 0 | [0, 149] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_032738__722.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9356 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_032642__897 | 2 | 0.0 | 5.5855 | 4 | [0, 644] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_032642__897.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9357 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_032646__617 | 1 | 0.0 | 4.25938 | 4 | [0, 495] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_032646__617.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9358 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_032649__791 | 0 | 0.0 | 2.59933 | 0 | [0, 304] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_032649__791.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9359 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_032654__606 | 0 | 0.0 | 4.75126 | 0 | [0, 551] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_032654__606.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9360 | NVIDIA-RTX-4090-4x | pig_latinify | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_032657__148 | 1 | 0.0 | 3.26534 | 4 | [0, 382] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_032657__148.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9361 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_134213__293 | 0 | 0.0 | 127.423 | 0 | [64, 777] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_134213__293.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9362 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_134347__844 | 0 | 0.0 | 93.6402 | 0 | [64, 575] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_134347__844.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9363 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_133915__199 | 0 | 0.0 | 50.9174 | 0 | [67, 308] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_133915__199.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9364 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_134006__603 | 1 | 0.0 | 50.6024 | 2 | [67, 306] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_134006__603.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9365 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_020752__373 | 0 | 0.0 | 54.759 | 0 | [67, 331] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_020752__373.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9366 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_133700__108 | 0 | 0.0 | 70.3717 | 0 | [108, 422] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_133700__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9367 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_133823__159 | 4 | 0.0 | 82.8659 | 3 | [108, 498] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_133823__159.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 9368 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_020656__422 | 0 | 0.0 | 47.845 | 0 | [108, 283] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_020656__422.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9369 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_133508__568 | 2 | 0.0 | 83.0887 | 2 | [197, 322] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_133508__568.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9370 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_133550__551 | 2 | 0.0 | 41.6344 | 2 | [197, 228] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_133550__551.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9371 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_020609__962 | 2 | 0.0 | 79.869 | 2 | [197, 317] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_020609__962.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9372 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_134740__983 | 0 | 0.0 | 71.6321 | 0 | [396, 376] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_134740__983.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9373 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_134848__152 | 0 | 0.0 | 68.5964 | 0 | [396, 358] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_134848__152.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9374 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_021005__235 | 3 | 0.0 | 44.5888 | 2 | [396, 213] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021005__235.json | 77.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9375 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_134515__641 | 0 | 0.0 | 88.1433 | 0 | [394, 474] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_134515__641.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9376 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_134628__353 | 3 | 0.0 | 71.6973 | 4 | [394, 376] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_134628__353.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9377 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_020920__515 | 3 | 0.0 | 88.1929 | 2 | [394, 474] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_020920__515.json | 77.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9378 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_022003__367 | 0 | 0.0 | 11.8422 | 0 | [69, 454] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_022003__367.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9379 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_123502__623 | 1 | 0.0 | 10.9055 | 1 | [69, 420] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_123502__623.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9380 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_123513__823 | 0 | 0.0 | 11.7069 | 0 | [69, 450] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_123513__823.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9381 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_123534__678 | 0 | 0.0 | 20.1983 | 0 | [69, 757] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_123534__678.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9382 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_021951__337 | 0 | 0.0 | 13.6045 | 0 | [106, 513] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_021951__337.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9383 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_123422__713 | 0 | 0.0 | 11.5347 | 0 | [106, 437] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_123422__713.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9384 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_123437__851 | 0 | 0.0 | 15.0882 | 0 | [106, 568] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_123437__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9385 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_123450__681 | 0 | 0.0 | 13.6626 | 0 | [106, 516] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_123450__681.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9386 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_021938__671 | 0 | 0.0 | 11.2671 | 0 | [193, 284] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021938__671.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9387 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_123340__512 | 0 | 0.0 | 13.131 | 0 | [193, 347] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123340__512.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9388 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_123357__794 | 0 | 0.0 | 17.1689 | 0 | [193, 621] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123357__794.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9389 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_123410__248 | 0 | 0.0 | 13.2974 | 0 | [193, 483] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123410__248.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9390 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_022028__969 | 0 | 0.0 | 11.8747 | 0 | [358, 396] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_022028__969.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9391 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_123652__250 | 0 | 0.0 | 26.2219 | 0 | [358, 882] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123652__250.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9392 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_123704__657 | 0 | 0.0 | 11.9096 | 0 | [358, 398] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123704__657.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9393 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_123719__114 | 0 | 0.0 | 14.6431 | 0 | [358, 494] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123719__114.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9394 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_022016__963 | 0 | 0.0 | 12.5721 | 0 | [355, 421] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_022016__963.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9395 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_123555__345 | 0 | 0.0 | 21.7296 | 0 | [355, 736] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_123555__345.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9396 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_123610__556 | 0 | 0.0 | 14.6353 | 0 | [355, 494] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_123610__556.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9397 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_123626__310 | 0 | 0.0 | 15.7251 | 0 | [355, 532] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_123626__310.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9398 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_111527__367 | 0 | 0.0 | 5.5342 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_111527__367.json | 25.0 | missing | missing | missing | |
| 9399 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_111530__813 | 0 | 0.0 | 3.32046 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_111530__813.json | 25.0 | missing | missing | missing | |
| 9400 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_111533__183 | 0 | 0.0 | 2.53705 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_111533__183.json | 25.0 | missing | missing | missing | |
| 9401 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_111538__807 | 0 | 0.0 | 4.85096 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_111538__807.json | 25.0 | missing | missing | missing | |
| 9402 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_111541__690 | 0 | 0.0 | 2.70809 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_111541__690.json | 25.0 | missing | missing | missing | |
| 9403 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_111448__761 | 0 | 0.0 | 3.43833 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111448__761.json | 50.0 | missing | missing | missing | |
| 9404 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_111450__212 | 0 | 0.0 | 2.84171 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111450__212.json | 50.0 | missing | missing | missing | |
| 9405 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240217_111452__156 | 0 | 0.0 | 1.93634 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111452__156.json | 25.0 | missing | missing | missing | |
| 9406 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240217_111457__460 | 0 | 0.0 | 4.20532 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111457__460.json | 25.0 | missing | missing | missing | |
| 9407 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240217_111500__668 | 0 | 0.0 | 3.55957 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111500__668.json | 25.0 | missing | missing | missing | |
| 9408 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_111356__485 | 0 | 0.0 | 5.4575 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111356__485.json | 25.0 | missing | missing | missing | |
| 9409 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_111404__546 | 0 | 0.0 | 7.82127 | 4 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111404__546.json | 75.0 | missing | missing | missing | |
| 9410 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_111410__998 | 0 | 0.0 | 5.7718 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111410__998.json | 0.0 | missing | missing | missing | |
| 9411 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_111414__959 | 0 | 0.0 | 4.04215 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111414__959.json | 25.0 | missing | missing | missing | |
| 9412 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_111418__860 | 0 | 0.0 | 4.05922 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111418__860.json | 25.0 | missing | missing | missing | |
| 9413 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_111747__346 | 0 | 0.0 | 3.80506 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111747__346.json | 25.0 | missing | missing | missing | |
| 9414 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_111751__914 | 0 | 0.0 | 3.74634 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111751__914.json | 0.0 | missing | missing | missing | |
| 9415 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_111754__274 | 0 | 0.0 | 3.13216 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111754__274.json | 0.0 | missing | missing | missing | |
| 9416 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_111802__880 | 0 | 0.0 | 7.46466 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111802__880.json | 0.0 | missing | missing | missing | |
| 9417 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_111807__656 | 0 | 0.0 | 4.89494 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_111807__656.json | 25.0 | missing | missing | missing | |
| 9418 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_111627__210 | 0 | 0.0 | 6.90295 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111627__210.json | 0.0 | missing | missing | missing | |
| 9419 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_111634__245 | 0 | 0.0 | 6.63272 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111634__245.json | 0.0 | missing | missing | missing | |
| 9420 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_111640__770 | 0 | 0.0 | 6.27229 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111640__770.json | 25.0 | missing | missing | missing | |
| 9421 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_111655__673 | 0 | 0.0 | 13.9826 | 0 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111655__673.json | 0.0 | missing | missing | missing | |
| 9422 | Apple-MacBook-Pro-M1 | pig_latinify | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_111658__173 | 1 | 0.0 | 3.87117 | 1 | [0, 0] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_111658__173.json | 61.25 | missing | missing | missing | |
| 9423 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_235832__711 | 0 | 0.0 | 30.8224 | 0 | [0, 470] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_235832__711.json | 0.0 | missing | missing | missing | |
| 9424 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_235855__346 | 0 | 0.0 | 22.5572 | 0 | [0, 347] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_235855__346.json | 0.0 | missing | missing | missing | |
| 9425 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240223_235920__893 | 0 | 0.0 | 25.4101 | 0 | [0, 387] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_235920__893.json | 0.0 | missing | missing | missing | |
| 9426 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240223_235945__676 | 0 | 0.0 | 24.2141 | 0 | [0, 370] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240223_235945__676.json | 50.0 | missing | missing | missing | |
| 9427 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240224_000007__478 | 0 | 0.0 | 22.3841 | 0 | [0, 339] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_000007__478.json | 0.0 | missing | missing | missing | |
| 9428 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240223_235447__698 | 0 | 0.0 | 19.0511 | 0 | [0, 289] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_235447__698.json | 25.0 | missing | missing | missing | |
| 9429 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_235507__170 | 0 | 0.0 | 20.3198 | 0 | [0, 310] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_235507__170.json | 0.0 | missing | missing | missing | |
| 9430 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_235527__108 | 0 | 0.0 | 20.0635 | 0 | [0, 310] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_235527__108.json | 0.0 | missing | missing | missing | |
| 9431 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240223_235547__255 | 0 | 0.0 | 19.6131 | 0 | [0, 296] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_235547__255.json | 50.0 | missing | missing | missing | |
| 9432 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240223_235606__106 | 0 | 0.0 | 18.6973 | 0 | [0, 286] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240223_235606__106.json | 0.0 | missing | missing | missing | |
| 9433 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_235130__939 | 0 | 0.0 | 25.6331 | 0 | [0, 388] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_235130__939.json | 0.0 | missing | missing | missing | |
| 9434 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_235150__148 | 0 | 0.0 | 19.914 | 0 | [0, 305] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_235150__148.json | 0.0 | missing | missing | missing | |
| 9435 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240223_235213__247 | 0 | 0.0 | 23.0996 | 0 | [0, 352] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_235213__247.json | 25.0 | missing | missing | missing | |
| 9436 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240223_235236__333 | 0 | 0.0 | 23.0584 | 4 | [0, 350] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_235236__333.json | 75.0 | missing | missing | missing | |
| 9437 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240223_235306__451 | 0 | 0.0 | 29.6222 | 0 | [0, 450] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240223_235306__451.json | 0.0 | missing | missing | missing | |
| 9438 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240224_000812__520 | 0 | 0.0 | 28.8209 | 0 | [0, 435] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_000812__520.json | 0.0 | missing | missing | missing | |
| 9439 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240224_000847__352 | 1 | 0.0 | 35.5977 | 1 | [0, 540] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_000847__352.json | 61.25 | missing | missing | missing | |
| 9440 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240224_000927__460 | 0 | 0.0 | 39.8017 | 0 | [0, 602] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_000927__460.json | 0.0 | missing | missing | missing | |
| 9441 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240224_001001__273 | 0 | 0.0 | 33.6371 | 0 | [0, 509] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_001001__273.json | 25.0 | missing | missing | missing | |
| 9442 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240224_001035__592 | 0 | 0.0 | 33.7642 | 0 | [0, 509] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_001035__592.json | 0.0 | missing | missing | missing | |
| 9443 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240224_000308__960 | 0 | 0.0 | 28.8451 | 0 | [0, 435] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_000308__960.json | 0.0 | missing | missing | missing | |
| 9444 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20240224_000335__478 | 0 | 0.0 | 26.8666 | 0 | [0, 407] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_000335__478.json | 25.0 | missing | missing | missing | |
| 9445 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20240224_000405__951 | 0 | 0.0 | 29.9573 | 0 | [0, 455] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_000405__951.json | 25.0 | missing | missing | missing | |
| 9446 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240224_000433__257 | 0 | 0.0 | 27.7856 | 0 | [0, 420] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_000433__257.json | 0.0 | missing | missing | missing | |
| 9447 | Apple-MacBook-Pro-M1 | pig_latinify | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240224_000503__230 | 0 | 0.0 | 29.9164 | 0 | [0, 447] | 0.13.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_000503__230.json | 0.0 | missing | missing | missing | |
| 9448 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231213_203759__960 | 0 | 0.0005545 | 12.9703 | 0 | [59, 350] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_203759__960.json | 0.0 | missing | missing | missing | |
| 9449 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_195043__587 | 0 | 0.0005365 | 5.70573 | 0 | [59, 338] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_195043__587.json | 0.0 | missing | missing | missing | |
| 9450 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_195048__992 | 0 | 0.000517 | 4.80925 | 0 | [59, 325] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_195048__992.json | 0.0 | missing | missing | missing | |
| 9451 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo--optim | AsIs | 1SHOT | false | false | 5 | 20231215_200250__901 | 0 | 0.0 | 7.12289 | 0 | [59, 311] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_200250__901.json | 0.0 | 0.5 | missing | 0.5 | |
| 9452 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | InJulia | 1SHOT | true | false | 5 | 20231213_203746__547 | 0 | 0.000766 | 11.5136 | 0 | [62, 490] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_203746__547.json | 25.0 | missing | missing | missing | |
| 9453 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_195031__265 | 1 | 0.00064 | 6.8573 | 2 | [62, 406] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_195031__265.json | 67.5 | missing | missing | missing | |
| 9454 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_195037__914 | 1 | 0.000556 | 5.74892 | 1 | [62, 350] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_195037__914.json | 61.25 | missing | missing | missing | |
| 9455 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | InJulia | 1SHOT | true | false | 5 | 20231227_202516__130 | 0 | 0.0006235 | 7.88685 | 0 | [62, 395] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_202516__130.json | 25.0 | missing | missing | missing | |
| 9456 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_202524__167 | 1 | 0.0006835 | 7.71314 | 1 | [62, 435] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_202524__167.json | 61.25 | missing | missing | missing | |
| 9457 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_200243__170 | 0 | 0.0 | 8.49546 | 0 | [62, 414] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_200243__170.json | 50.0 | 0.5 | missing | 0.5 | |
| 9458 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_203734__728 | 0 | 0.0005 | 7.24773 | 0 | [97, 301] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_203734__728.json | 25.0 | missing | missing | missing | |
| 9459 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_195019__411 | 0 | 0.0005315 | 5.85033 | 0 | [97, 322] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_195019__411.json | 25.0 | missing | missing | missing | |
| 9460 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_195024__175 | 0 | 0.000425 | 5.34626 | 0 | [97, 251] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_195024__175.json | 25.0 | missing | missing | missing | |
| 9461 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_202503__718 | 0 | 0.0003995 | 4.14931 | 0 | [97, 234] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_202503__718.json | 0.0 | missing | missing | missing | |
| 9462 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_202508__779 | 2 | 0.000473 | 5.16033 | 2 | [97, 283] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_202508__779.json | 72.5 | missing | missing | missing | |
| 9463 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231215_200234__470 | 0 | 0.0 | 8.23331 | 0 | [97, 352] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_200234__470.json | 25.0 | 0.5 | missing | 0.5 | |
| 9464 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_203727__378 | 0 | 0.00028 | 3.61465 | 0 | [170, 130] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203727__378.json | 0.0 | missing | missing | missing | |
| 9465 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_195009__950 | 0 | 0.000238 | 2.51829 | 0 | [170, 102] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_195009__950.json | 0.0 | missing | missing | missing | |
| 9466 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_195013__292 | 0 | 0.000427 | 4.03007 | 0 | [170, 228] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_195013__292.json | 0.0 | missing | missing | missing | |
| 9467 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_202449__885 | 0 | 0.0002275 | 1.97429 | 0 | [170, 95] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202449__885.json | 0.0 | missing | missing | missing | |
| 9468 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_202458__489 | 4 | 0.0009325 | 9.51867 | 4 | [170, 565] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202458__489.json | 95.0 | missing | missing | missing | |
| 9469 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_200226__399 | 3 | 0.0 | 7.68926 | 2 | [170, 343] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_200226__399.json | 77.5 | 0.5 | missing | 0.5 | |
| 9470 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_203805__258 | 0 | 0.0003985 | 3.81213 | 0 | [320, 159] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203805__258.json | 0.0 | missing | missing | missing | |
| 9471 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_195056__668 | 0 | 0.0005125 | 3.2548 | 0 | [320, 235] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_195056__668.json | 0.0 | missing | missing | missing | |
| 9472 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_195101__773 | 0 | 0.0006415 | 4.73418 | 0 | [320, 321] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_195101__773.json | 50.0 | missing | missing | missing | |
| 9473 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_202544__403 | 0 | 0.0004705 | 3.38542 | 0 | [320, 207] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202544__403.json | 0.0 | missing | missing | missing | |
| 9474 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_202546__552 | 0 | 0.000316 | 2.0442 | 0 | [320, 104] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202546__552.json | 0.0 | missing | missing | missing | |
| 9475 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_200255__692 | 0 | 0.0 | 2.32162 | 0 | [320, 101] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_200255__692.json | 0.0 | 0.5 | missing | 0.5 | |
| 9476 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_203801__826 | 0 | 0.00029 | 2.56277 | 0 | [319, 87] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_203801__826.json | 0.0 | missing | missing | missing | |
| 9477 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_195050__606 | 0 | 0.0003215 | 2.20431 | 0 | [319, 108] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_195050__606.json | 0.0 | missing | missing | missing | |
| 9478 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_195053__655 | 0 | 0.0003635 | 2.8054 | 0 | [319, 136] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_195053__655.json | 0.0 | missing | missing | missing | |
| 9479 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202539__851 | 2 | 0.001529 | 15.3263 | 4 | [319, 913] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_202539__851.json | 85.0 | missing | missing | missing | |
| 9480 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_202541__119 | 0 | 0.0002795 | 1.78094 | 0 | [319, 80] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_202541__119.json | 0.0 | missing | missing | missing | |
| 9481 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_200253__387 | 0 | 0.0 | 2.8069 | 0 | [319, 118] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_200253__387.json | 0.0 | 0.5 | missing | 0.5 | |
| 9482 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | false | 5 | 20240201_200610__832 | 0 | 0.0004555 | 2.17232 | 0 | [62, 283] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200610__832.json | 25.0 | missing | missing | missing | |
| 9483 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200612__777 | 3 | 0.0004945 | 2.19734 | 2 | [62, 309] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200612__777.json | 77.5 | missing | missing | missing | |
| 9484 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200615__917 | 2 | 0.000574 | 2.8974 | 2 | [62, 362] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200615__917.json | 72.5 | missing | missing | missing | |
| 9485 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | false | 5 | 20240201_200618__210 | 0 | 0.0005605 | 2.46064 | 0 | [62, 353] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200618__210.json | 25.0 | missing | missing | missing | |
| 9486 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | false | 5 | 20240201_200620__416 | 0 | 0.0005425 | 2.56351 | 0 | [62, 341] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200620__416.json | 25.0 | missing | missing | missing | |
| 9487 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200601__730 | 0 | 0.000302 | 1.47882 | 0 | [97, 169] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200601__730.json | 50.0 | missing | missing | missing | |
| 9488 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200603__201 | 0 | 0.0003125 | 1.56237 | 0 | [97, 176] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200603__201.json | 50.0 | missing | missing | missing | |
| 9489 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200605__764 | 0 | 0.000395 | 1.88486 | 0 | [97, 231] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200605__764.json | 50.0 | missing | missing | missing | |
| 9490 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200606__337 | 0 | 0.000314 | 1.31397 | 0 | [97, 177] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200606__337.json | 50.0 | missing | missing | missing | |
| 9491 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200608__669 | 3 | 0.0003305 | 1.61675 | 2 | [97, 188] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200608__669.json | 77.5 | missing | missing | missing | |
| 9492 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200551__482 | 0 | 0.0003115 | 1.38324 | 0 | [170, 151] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200551__482.json | 25.0 | missing | missing | missing | |
| 9493 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200554__863 | 0 | 0.000424 | 2.35617 | 0 | [170, 226] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200554__863.json | 25.0 | missing | missing | missing | |
| 9494 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200556__282 | 0 | 0.0005125 | 2.01754 | 0 | [170, 285] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200556__282.json | 25.0 | missing | missing | missing | |
| 9495 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200557__437 | 0 | 0.000283 | 1.26181 | 0 | [170, 132] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200557__437.json | 25.0 | missing | missing | missing | |
| 9496 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200559__162 | 0 | 0.0004915 | 1.90035 | 0 | [170, 271] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200559__162.json | 25.0 | missing | missing | missing | |
| 9497 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200630__706 | 0 | 0.0003445 | 1.20477 | 0 | [320, 123] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200630__706.json | 0.0 | missing | missing | missing | |
| 9498 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200631__968 | 0 | 0.000235 | 0.68615 | 0 | [320, 50] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200631__968.json | 0.0 | missing | missing | missing | |
| 9499 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200632__502 | 0 | 0.0003175 | 0.941933 | 0 | [320, 105] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200632__502.json | 0.0 | missing | missing | missing | |
| 9500 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200633__344 | 0 | 0.0003025 | 0.839821 | 0 | [320, 95] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200633__344.json | 0.0 | missing | missing | missing | |
| 9501 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200634__142 | 0 | 0.0003115 | 0.947661 | 0 | [320, 101] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200634__142.json | 0.0 | missing | missing | missing | |
| 9502 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_200621__767 | 0 | 0.0003095 | 1.13696 | 0 | [319, 100] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200621__767.json | 0.0 | missing | missing | missing | |
| 9503 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200623__700 | 0 | 0.0005165 | 1.75505 | 0 | [319, 238] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200623__700.json | 50.0 | missing | missing | missing | |
| 9504 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200625__146 | 1 | 0.0004985 | 1.68813 | 1 | [319, 226] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200625__146.json | 61.25 | missing | missing | missing | |
| 9505 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200626__206 | 3 | 0.000437 | 1.36922 | 2 | [319, 185] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200626__206.json | 77.5 | missing | missing | missing | |
| 9506 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200629__494 | 4 | 0.000626 | 2.22548 | 4 | [319, 311] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200629__494.json | 95.0 | missing | missing | missing | |
| 9507 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231213_203822__402 | 0 | 0.000721 | 5.31502 | 0 | [59, 331] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_203822__402.json | 0.0 | missing | missing | missing | |
| 9508 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_224608__303 | 0 | 0.000699 | 3.42372 | 0 | [59, 320] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_224608__303.json | 0.0 | missing | missing | missing | |
| 9509 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_224612__804 | 0 | 0.000719 | 3.81635 | 0 | [59, 330] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_224612__804.json | 0.0 | missing | missing | missing | |
| 9510 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 5 | 20231215_200310__101 | 0 | 0.0 | 4.67053 | 0 | [59, 273] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_200310__101.json | 0.0 | 0.9 | missing | 0.1 | |
| 9511 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | InJulia | 1SHOT | false | false | 5 | 20231213_203817__725 | 0 | 0.000722 | 6.67934 | 0 | [62, 330] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_203817__725.json | 0.0 | missing | missing | missing | |
| 9512 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_224600__831 | 0 | 0.000596 | 2.8333 | 0 | [62, 267] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_224600__831.json | 50.0 | missing | missing | missing | |
| 9513 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | false | 5 | 20231225_224604__256 | 0 | 0.000768 | 4.21987 | 0 | [62, 353] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_224604__256.json | 25.0 | missing | missing | missing | |
| 9514 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | false | 5 | 20231227_202602__594 | 0 | 0.00054 | 3.19187 | 0 | [62, 239] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_202602__594.json | 25.0 | missing | missing | missing | |
| 9515 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | false | 5 | 20231227_202607__785 | 0 | 0.00073 | 5.20835 | 0 | [62, 334] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_202607__785.json | 25.0 | missing | missing | missing | |
| 9516 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_200305__348 | 0 | 0.0 | 4.23568 | 0 | [62, 251] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_200305__348.json | 50.0 | 0.9 | missing | 0.1 |
| 9517 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203810__807 | 1 | 0.000239 | 1.55075 | 1 | [97, 71] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_203810__807.json | 61.25 | missing | missing | missing | |
| 9518 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_224555__916 | 0 | 0.000267 | 1.29335 | 0 | [97, 85] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_224555__916.json | 0.0 | missing | missing | missing | |
| 9519 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_224557__406 | 0 | 0.000421 | 2.22597 | 0 | [97, 162] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_224557__406.json | 25.0 | missing | missing | missing | |
| 9520 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_202554__916 | 0 | 0.000419 | 2.72133 | 0 | [97, 161] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_202554__916.json | 25.0 | missing | missing | missing | |
| 9521 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_202559__652 | 0 | 0.000515 | 4.30472 | 0 | [97, 209] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_202559__652.json | 25.0 | missing | missing | missing | |
| 9522 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231215_200301__724 | 0 | 0.0 | 2.57165 | 0 | [97, 136] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_200301__724.json | 25.0 | 0.9 | missing | 0.1 |
| 9523 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_203809__866 | 0 | 0.000492 | 3.34097 | 0 | [170, 161] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203809__866.json | 25.0 | missing | missing | missing | |
| 9524 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_224553__597 | 0 | 0.00049 | 2.02675 | 0 | [170, 160] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_224553__597.json | 25.0 | missing | missing | missing | |
| 9525 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_202549__123 | 0 | 0.000438 | 2.13805 | 0 | [170, 134] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202549__123.json | 25.0 | missing | missing | missing | |
| 9526 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_202552__972 | 0 | 0.00052 | 2.99276 | 0 | [170, 175] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202552__972.json | 25.0 | missing | missing | missing | |
| 9527 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_200258__230 | 0 | 0.0 | 2.62805 | 0 | [170, 150] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_200258__230.json | 25.0 | 0.9 | missing | 0.1 |
| 9528 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_203829__607 | 0 | 0.000528 | 2.46973 | 0 | [320, 104] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_203829__607.json | 0.0 | missing | missing | missing | |
| 9529 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_224616__237 | 0 | 0.000558 | 1.52826 | 0 | [320, 119] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_224616__237.json | 0.0 | missing | missing | missing | |
| 9530 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_224618__831 | 0 | 0.000504 | 1.32919 | 0 | [320, 92] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_224618__831.json | 0.0 | missing | missing | missing | |
| 9531 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_202617__552 | 0 | 0.00059 | 2.17564 | 0 | [320, 135] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202617__552.json | 0.0 | missing | missing | missing | |
| 9532 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_202619__747 | 0 | 0.000494 | 1.6555 | 0 | [320, 87] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_202619__747.json | 0.0 | missing | missing | missing | |
| 9533 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_200316__970 | 0 | 0.0 | 2.0253 | 0 | [320, 108] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_200316__970.json | 0.0 | 0.9 | missing | 0.1 |
| 9534 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_203826__853 | 3 | 0.000677 | 3.92629 | 2 | [319, 179] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_203826__853.json | 77.5 | missing | missing | missing | |
| 9535 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_224613__541 | 0 | 0.000487 | 1.24977 | 0 | [319, 84] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_224613__541.json | 0.0 | missing | missing | missing | |
| 9536 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_224615__419 | 0 | 0.000561 | 1.76367 | 0 | [319, 121] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_224615__419.json | 0.0 | missing | missing | missing | |
| 9537 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202611__109 | 0 | 0.000649 | 3.28758 | 0 | [319, 165] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_202611__109.json | 50.0 | missing | missing | missing | |
| 9538 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_202615__235 | 2 | 0.000905 | 4.49942 | 1 | [319, 293] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_202615__235.json | 66.25 | missing | missing | missing | |
| 9539 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_200313__179 | 3 | 0.0 | 3.48191 | 2 | [319, 180] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_200313__179.json | 77.5 | 0.9 | missing | 0.1 |
| 9540 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_104645__363 | 2 | 0.02177 | 100.832 | 2 | [62, 705] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_104645__363.json | 72.5 | missing | missing | missing | |
| 9541 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_104737__718 | 3 | 0.02135 | 51.7338 | 2 | [62, 691] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_104737__718.json | 77.5 | missing | missing | missing | |
| 9542 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | InJulia | 1SHOT | true | false | 5 | 20240201_104824__352 | 0 | 0.01829 | 47.0268 | 0 | [62, 589] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_104824__352.json | 25.0 | missing | missing | missing | |
| 9543 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | InJulia | 1SHOT | true | false | 5 | 20240201_104902__927 | 0 | 0.01772 | 37.1665 | 0 | [62, 570] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_104902__927.json | 25.0 | missing | missing | missing | |
| 9544 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_104946__534 | 3 | 0.02027 | 43.8217 | 2 | [62, 655] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_104946__534.json | 77.5 | missing | missing | missing | |
| 9545 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_103946__559 | 2 | 0.0157 | 38.6702 | 2 | [97, 491] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_103946__559.json | 72.5 | missing | missing | missing | |
| 9546 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_103956__256 | 0 | 0.00565 | 10.4772 | 0 | [97, 156] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_103956__256.json | 25.0 | missing | missing | missing | |
| 9547 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_104037__395 | 2 | 0.01456 | 40.8346 | 2 | [97, 453] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_104037__395.json | 72.5 | missing | missing | missing | |
| 9548 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_104118__817 | 2 | 0.01648 | 40.5535 | 2 | [97, 517] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_104118__817.json | 72.5 | missing | missing | missing | |
| 9549 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_104132__527 | 3 | 0.00556 | 13.7202 | 2 | [97, 153] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_104132__527.json | 77.5 | missing | missing | missing | |
| 9550 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_103348__451 | 2 | 0.02 | 66.0896 | 2 | [170, 610] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_103348__451.json | 72.5 | missing | missing | missing | |
| 9551 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_103426__936 | 0 | 0.0155 | 38.2694 | 0 | [170, 460] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_103426__936.json | 25.0 | missing | missing | missing | |
| 9552 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_103515__769 | 2 | 0.01778 | 48.8229 | 2 | [170, 536] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_103515__769.json | 72.5 | missing | missing | missing | |
| 9553 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_103603__814 | 0 | 0.01952 | 48.2512 | 0 | [170, 594] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_103603__814.json | 25.0 | missing | missing | missing | |
| 9554 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_103658__979 | 3 | 0.02045 | 54.6875 | 4 | [170, 625] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_103658__979.json | 90.0 | missing | missing | missing | |
| 9555 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_110313__819 | 2 | 0.02081 | 50.6715 | 2 | [320, 587] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_110313__819.json | 72.5 | missing | missing | missing | |
| 9556 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_110405__461 | 0 | 0.02174 | 51.6291 | 1 | [320, 618] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_110405__461.json | 56.25 | missing | missing | missing | |
| 9557 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_110504__799 | 0 | 0.02159 | 59.1605 | 0 | [320, 613] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_110504__799.json | 25.0 | missing | missing | missing | |
| 9558 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_110604__894 | 0 | 0.02252 | 59.9861 | 0 | [320, 644] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_110604__894.json | 25.0 | missing | missing | missing | |
| 9559 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_110654__571 | 0 | 0.02396 | 49.8653 | 1 | [320, 692] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_110654__571.json | 56.25 | missing | missing | missing | |
| 9560 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_105424__886 | 0 | 0.01891 | 40.0652 | 0 | [319, 524] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_105424__886.json | 25.0 | missing | missing | missing | |
| 9561 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_105508__567 | 0 | 0.01909 | 43.5215 | 0 | [319, 530] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_105508__567.json | 25.0 | missing | missing | missing | |
| 9562 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | false | 5 | 20240201_105628__743 | 0 | 0.02158 | 80.4958 | 0 | [319, 613] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_105628__743.json | 25.0 | missing | missing | missing | |
| 9563 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_105709__533 | 3 | 0.01846 | 40.0213 | 4 | [319, 509] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_105709__533.json | 90.0 | missing | missing | missing | |
| 9564 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_105758__985 | 0 | 0.02167 | 49.7679 | 1 | [319, 616] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_105758__985.json | 56.25 | missing | missing | missing | |
| 9565 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231213_204137__843 | 0 | 0.02126 | 44.1789 | 0 | [59, 689] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_204137__843.json | 0.0 | missing | missing | missing | |
| 9566 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_224845__256 | 0 | 0.01706 | 18.6113 | 0 | [59, 549] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_224845__256.json | 0.0 | missing | missing | missing | |
| 9567 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_224908__676 | 0 | 0.02213 | 23.6322 | 0 | [59, 718] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_224908__676.json | 0.0 | missing | missing | missing | |
| 9568 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 5 | 20231215_200604__439 | 0 | 0.0 | 44.3895 | 0 | [59, 561] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_200604__439.json | 0.0 | 0.1 | missing | 0.9 |
| 9569 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_204052__849 | 1 | 0.0218 | 62.0306 | 1 | [62, 706] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_204052__849.json | 61.25 | missing | missing | missing | |
| 9570 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_224801__707 | 2 | 0.01958 | 30.1325 | 2 | [62, 632] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_224801__707.json | 72.5 | missing | missing | missing | |
| 9571 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | InJulia | 1SHOT | true | false | 5 | 20231225_224826__837 | 0 | 0.01967 | 25.231 | 0 | [62, 635] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_224826__837.json | 25.0 | missing | missing | missing | |
| 9572 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_202938__541 | 3 | 0.02342 | 72.1414 | 2 | [62, 760] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_202938__541.json | 77.5 | missing | missing | missing | |
| 9573 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_203038__771 | 2 | 0.01808 | 60.7021 | 2 | [62, 582] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_203038__771.json | 72.5 | missing | missing | missing | |
| 9574 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_200519__802 | 3 | 0.0 | 47.4325 | 2 | [62, 548] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_200519__802.json | 77.5 | 0.1 | missing | 0.9 |
| 9575 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_203950__508 | 2 | 0.01123 | 23.6208 | 2 | [97, 342] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_203950__508.json | 72.5 | missing | missing | missing | |
| 9576 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_224716__398 | 2 | 0.01621 | 18.3479 | 2 | [97, 508] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_224716__398.json | 72.5 | missing | missing | missing | |
| 9577 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_224730__406 | 3 | 0.01516 | 14.3286 | 2 | [97, 473] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_224730__406.json | 77.5 | missing | missing | missing | |
| 9578 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_202755__137 | 2 | 0.01294 | 38.0055 | 2 | [97, 399] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_202755__137.json | 72.5 | missing | missing | missing | |
| 9579 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_202825__592 | 0 | 0.01264 | 30.3615 | 0 | [97, 389] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_202825__592.json | 25.0 | missing | missing | missing | |
| 9580 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_200432__578 | 3 | 0.0 | 37.141 | 2 | [97, 444] | 0.10.0-DEV | 4 | 1.0 | {"temperature": 0.1, "top_p": 0.9} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_200432__578.json | 77.5 | 0.1 | missing | 0.9 |
| 9581 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_203926__626 | 2 | 0.01418 | 56.9517 | 2 | [170, 416] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_203926__626.json | 72.5 | missing | missing | missing | |
| 9582 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_224640__403 | 3 | 0.01727 | 22.1638 | 4 | [170, 519] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_224640__403.json | 90.0 | missing | missing | missing | |
| 9583 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_224657__981 | 3 | 0.01121 | 16.8536 | 4 | [170, 317] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_224657__981.json | 90.0 | missing | missing | missing | |
| 9584 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_202645__534 | 2 | 0.01574 | 25.9931 | 2 | [170, 468] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202645__534.json | 72.5 | missing | missing | missing | |
| 9585 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_202717__922 | 0 | 0.01376 | 31.5747 | 0 | [170, 402] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_202717__922.json | 25.0 | missing | missing | missing | |
| 9586 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_200355__442 | 3 | 0.0 | 39.2754 | 2 | [170, 512] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_200355__442.json | 77.5 | 0.1 | missing | 0.9 | |
| 9587 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_204333__748 | 0 | 0.02234 | 63.6806 | 0 | [320, 638] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_204333__748.json | 25.0 | missing | missing | missing | |
| 9588 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_225024__646 | 0 | 0.0215 | 27.1873 | 0 | [320, 610] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225024__646.json | 50.0 | missing | missing | missing | |
| 9589 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_225044__527 | 0 | 0.02219 | 19.4633 | 0 | [320, 633] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225044__527.json | 25.0 | missing | missing | missing | |
| 9590 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_203305__322 | 3 | 0.0146 | 37.0117 | 2 | [320, 380] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_203305__322.json | 77.5 | missing | missing | missing | |
| 9591 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_203348__547 | 0 | 0.01982 | 42.5988 | 1 | [320, 554] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_203348__547.json | 56.25 | missing | missing | missing | |
| 9592 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_200727__685 | 0 | 0.0 | 50.1383 | 1 | [320, 445] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_200727__685.json | 56.25 | 0.1 | missing | 0.9 | |
| 9593 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_204229__400 | 3 | 0.01966 | 52.0707 | 4 | [319, 549] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_204229__400.json | 90.0 | missing | missing | missing | |
| 9594 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_224934__500 | 0 | 0.0226 | 25.2068 | 1 | [319, 647] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_224934__500.json | 56.25 | missing | missing | missing | |
| 9595 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_224957__964 | 1 | 0.01936 | 22.5297 | 1 | [319, 539] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_224957__964.json | 61.25 | missing | missing | missing | |
| 9596 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_203125__716 | 3 | 0.02173 | 46.4325 | 4 | [319, 618] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_203125__716.json | 90.0 | missing | missing | missing | |
| 9597 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_203228__924 | 0 | 0.02194 | 62.5066 | 0 | [319, 625] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_203228__924.json | 25.0 | missing | missing | missing | |
| 9598 | Apple-MacBook-Pro-M1 | pig_latinify | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_200637__283 | 0 | 0.0 | 33.0322 | 1 | [319, 421] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_200637__283.json | 56.25 | 0.1 | missing | 0.9 | |
| 9599 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | AsIs | 1SHOT | false | false | 5 | 20231214_080535__464 | 0 | 0.0 | 17.9416 | 0 | [53, 532] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__AsIs__1SHOT__20231214_080535__464.json | 0.0 | missing | missing | missing | |
| 9600 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_122529__120 | 0 | 0.0 | 23.555 | 0 | [53, 688] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__AsIs__1SHOT__20231225_122529__120.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9601 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_122545__808 | 0 | 0.0 | 15.9165 | 0 | [1, 488] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__AsIs__1SHOT__20231225_122545__808.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9602 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | InJulia | 1SHOT | false | false | 5 | 20231214_080517__995 | 0 | 0.0 | 19.034 | 0 | [70, 557] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__InJulia__1SHOT__20231214_080517__995.json | 0.0 | missing | missing | missing | |
| 9603 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_122443__739 | 0 | 0.0 | 16.4281 | 0 | [70, 487] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__InJulia__1SHOT__20231225_122443__739.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9604 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_122506__948 | 0 | 0.0 | 22.1989 | 0 | [1, 662] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__InJulia__1SHOT__20231225_122506__948.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9605 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | InJulia | 1SHOT | false | false | 5 | 20231227_013748__213 | 0 | 0.0 | 14.8003 | 0 | [70, 446] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__InJulia__1SHOT__20231227_013748__213.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9606 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_080458__473 | 0 | 0.0 | 16.9059 | 0 | [99, 487] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_080458__473.json | 50.0 | missing | missing | missing | |
| 9607 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_122416__983 | 0 | 0.0 | 12.0738 | 0 | [99, 351] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_122416__983.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9608 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_122427__630 | 0 | 0.0 | 10.9186 | 0 | [1, 338] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_122427__630.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9609 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_013733__727 | 0 | 0.0 | 8.68223 | 0 | [99, 253] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_013733__727.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9610 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_080441__516 | 1 | 0.0 | 19.7883 | 1 | [187, 534] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080441__516.json | 61.25 | missing | missing | missing | |
| 9611 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_122351__855 | 0 | 0.0 | 20.2376 | 0 | [205, 407] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122351__855.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9612 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_122404__302 | 0 | 0.0 | 13.2071 | 0 | [1, 394] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122404__302.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9613 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_013724__814 | 0 | 0.0 | 25.7546 | 0 | [205, 569] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013724__814.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9614 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_080609__775 | 0 | 0.0 | 14.5555 | 0 | [11, 401] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080609__775.json | 25.0 | missing | missing | missing | |
| 9615 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_122656__550 | 0 | 0.0 | 22.8302 | 0 | [11, 615] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122656__550.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9616 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_122729__119 | 0 | 0.0 | 33.0791 | 0 | [1, 866] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122729__119.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9617 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_013840__225 | 0 | 0.0 | 23.4498 | 0 | [11, 637] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013840__225.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9618 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_080555__922 | 0 | 0.0 | 19.5064 | 0 | [370, 450] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_080555__922.json | 0.0 | missing | missing | missing | |
| 9619 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_122609__122 | 0 | 0.0 | 23.9108 | 0 | [370, 565] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_122609__122.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9620 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_122633__408 | 0 | 0.0 | 23.4272 | 0 | [1, 636] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_122633__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9621 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_013816__879 | 0 | 0.0 | 28.2876 | 0 | [370, 680] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_013816__879.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9622 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | AsIs | 1SHOT | false | false | 5 | 20231214_081717__459 | 0 | 0.0 | 25.2472 | 0 | [53, 728] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__AsIs__1SHOT__20231214_081717__459.json | 0.0 | missing | missing | missing | |
| 9623 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_131501__754 | 0 | 0.0 | 10.37 | 0 | [67, 299] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__AsIs__1SHOT__20231225_131501__754.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9624 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | InJulia | 1SHOT | true | false | 5 | 20231214_081652__286 | 0 | 0.0 | 13.5762 | 0 | [70, 404] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__InJulia__1SHOT__20231214_081652__286.json | 25.0 | missing | missing | missing | |
| 9625 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_125643__922 | 2 | 0.0 | 11.0357 | 2 | [70, 366] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__InJulia__1SHOT__20231225_125643__922.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9626 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_015705__648 | 0 | 0.0 | 10.3978 | 2 | [70, 341] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__InJulia__1SHOT__20231227_015705__648.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9627 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_081638__251 | 0 | 0.0 | 14.5211 | 1 | [99, 420] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_081638__251.json | 56.25 | missing | missing | missing | |
| 9628 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_125621__660 | 0 | 0.0 | 11.0939 | 0 | [109, 353] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_125621__660.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9629 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_125632__409 | 0 | 0.0 | 10.2177 | 0 | [109, 329] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_125632__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9630 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_015654__585 | 0 | 0.0 | 12.6526 | 0 | [109, 408] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_015654__585.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9631 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_081623__485 | 0 | 0.0 | 17.3955 | 0 | [187, 468] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081623__485.json | 0.0 | missing | missing | missing | |
| 9632 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_125553__638 | 0 | 0.0 | 23.4637 | 0 | [197, 540] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_125553__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9633 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_125610__355 | 0 | 0.0 | 17.1099 | 0 | [197, 524] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_125610__355.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9634 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_015641__304 | 1 | 0.0 | 16.3082 | 2 | [197, 330] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_015641__304.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9635 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_081805__812 | 0 | 0.0 | 31.5993 | 0 | [11, 824] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081805__812.json | 25.0 | missing | missing | missing | |
| 9636 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_131537__638 | 1 | 0.0 | 15.1512 | 1 | [373, 431] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_131537__638.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9637 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_131553__805 | 1 | 0.0 | 15.8887 | 1 | [373, 459] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_131553__805.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9638 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_015727__982 | 0 | 0.0 | 11.8863 | 0 | [373, 333] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_015727__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9639 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_081734__962 | 0 | 0.0 | 16.6255 | 0 | [370, 374] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_081734__962.json | 0.0 | missing | missing | missing | |
| 9640 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_131513__546 | 0 | 0.0 | 11.9524 | 2 | [370, 339] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_131513__546.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9641 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_131522__120 | 0 | 0.0 | 8.58172 | 0 | [370, 219] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_131522__120.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9642 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_015715__177 | 0 | 0.0 | 10.0699 | 1 | [370, 276] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_015715__177.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9643 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_182130__277 | 0 | 0.0 | 23.7372 | 0 | [70, 447] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182130__277.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9644 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_182154__361 | 0 | 0.0 | 23.9588 | 0 | [70, 460] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182154__361.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9645 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_182210__858 | 0 | 0.0 | 16.2913 | 0 | [70, 311] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182210__858.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9646 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182021__262 | 0 | 0.0 | 21.6223 | 0 | [109, 412] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182021__262.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9647 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182046__955 | 0 | 0.0 | 24.9038 | 1 | [109, 468] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182046__955.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9648 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_182106__800 | 0 | 0.0 | 20.1893 | 0 | [109, 383] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182106__800.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9649 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_181931__170 | 0 | 0.0 | 17.6014 | 0 | [197, 305] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181931__170.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9650 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_181944__445 | 0 | 0.0 | 12.6711 | 0 | [197, 223] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181944__445.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9651 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_181959__101 | 0 | 0.0 | 15.0933 | 0 | [197, 271] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181959__101.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9652 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_182337__434 | 0 | 0.0 | 14.3272 | 0 | [373, 239] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182337__434.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9653 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_182401__335 | 0 | 0.0 | 24.2041 | 1 | [373, 410] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182401__335.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9654 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_182421__468 | 0 | 0.0 | 20.0301 | 0 | [373, 348] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182421__468.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9655 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_182237__431 | 0 | 0.0 | 26.8419 | 0 | [370, 472] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182237__431.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9656 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182305__300 | 0 | 0.0 | 27.5392 | 1 | [370, 484] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182305__300.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9657 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182322__761 | 0 | 0.0 | 16.7266 | 1 | [370, 281] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182322__761.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9658 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231213_204616__493 | 0 | 0.00458455 | 18.5607 | 0 | [65, 545] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__AsIs__1SHOT__20231213_204616__493.json | 0.0 | missing | missing | missing | |
| 9659 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_225431__528 | 0 | 0.00385645 | 10.1185 | 0 | [65, 455] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__AsIs__1SHOT__20231225_225431__528.json | 0.0 | missing | missing | missing | |
| 9660 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_225449__903 | 0 | 0.00660705 | 17.8045 | 0 | [65, 795] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__AsIs__1SHOT__20231225_225449__903.json | 0.0 | missing | missing | missing | |
| 9661 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium--optim | AsIs | 1SHOT | false | false | 5 | 20231215_200924__459 | 0 | 0.0 | 10.9689 | 0 | [65, 501] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__AsIs__1SHOT__20231215_200924__459.json | 0.0 | 0.9 | missing | 0.3 | |
| 9662 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231213_204557__521 | 0 | 0.00401826 | 10.4887 | 0 | [68, 474] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__InJulia__1SHOT__20231213_204557__521.json | 25.0 | missing | missing | missing | |
| 9663 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231225_225409__263 | 0 | 0.0037432 | 9.79394 | 0 | [68, 440] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__InJulia__1SHOT__20231225_225409__263.json | 25.0 | missing | missing | missing | |
| 9664 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231225_225421__611 | 0 | 0.00426096 | 11.2211 | 0 | [68, 504] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__InJulia__1SHOT__20231225_225421__611.json | 25.0 | missing | missing | missing | |
| 9665 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231227_204300__948 | 0 | 0.00405062 | 10.6713 | 0 | [68, 478] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__InJulia__1SHOT__20231227_204300__948.json | 25.0 | missing | missing | missing | |
| 9666 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231227_204316__828 | 0 | 0.00528839 | 16.1065 | 0 | [68, 631] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__InJulia__1SHOT__20231227_204316__828.json | 25.0 | missing | missing | missing | |
| 9667 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium--optim | InJulia | 1SHOT | true | false | 5 | 20231215_200913__803 | 0 | 0.0 | 11.995 | 0 | [68, 547] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__InJulia__1SHOT__20231215_200913__803.json | 25.0 | 0.9 | missing | 0.3 | |
| 9668 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_204547__914 | 1 | 0.00475458 | 12.3323 | 1 | [107, 552] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_204547__914.json | 61.25 | missing | missing | missing | |
| 9669 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_225352__170 | 0 | 0.00371906 | 9.47996 | 0 | [107, 424] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_225352__170.json | 0.0 | missing | missing | missing | |
| 9670 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_225359__426 | 0 | 0.00278871 | 6.94726 | 0 | [107, 309] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_225359__426.json | 25.0 | missing | missing | missing | |
| 9671 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_203634__326 | 0 | 0.00354917 | 14.841 | 0 | [107, 403] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_203634__326.json | 25.0 | missing | missing | missing | |
| 9672 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_204249__874 | 0 | 0.00350872 | 14.9599 | 0 | [107, 398] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_204249__874.json | 25.0 | missing | missing | missing | |
| 9673 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_200901__948 | 2 | 0.0 | 7.5199 | 2 | [107, 342] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_200901__948.json | 72.5 | 0.9 | missing | 0.3 | |
| 9674 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_204534__350 | 0 | 0.00509735 | 25.8224 | 0 | [195, 565] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_204534__350.json | 25.0 | missing | missing | missing | |
| 9675 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_225331__370 | 0 | 0.00391621 | 17.0478 | 0 | [195, 419] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225331__370.json | 25.0 | missing | missing | missing | |
| 9676 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_225342__532 | 1 | 0.00466049 | 11.4811 | 1 | [195, 511] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225342__532.json | 61.25 | missing | missing | missing | |
| 9677 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_203605__577 | 0 | 0.00461195 | 11.9185 | 0 | [195, 505] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_203605__577.json | 25.0 | missing | missing | missing | |
| 9678 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_203619__742 | 2 | 0.0053805 | 13.7806 | 2 | [195, 600] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_203619__742.json | 72.5 | missing | missing | missing | |
| 9679 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_200853__110 | 0 | 0.0 | 13.7155 | 0 | [195, 482] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_200853__110.json | 25.0 | 0.9 | missing | 0.3 | |
| 9680 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_204723__212 | 0 | 0.00454242 | 27.9967 | 0 | [370, 438] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_204723__212.json | 25.0 | missing | missing | missing | |
| 9681 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_225616__660 | 0 | 0.00654065 | 32.7162 | 0 | [370, 685] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225616__660.json | 25.0 | missing | missing | missing | |
| 9682 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_225642__763 | 0 | 0.00477703 | 26.4749 | 0 | [370, 467] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225642__763.json | 25.0 | missing | missing | missing | |
| 9683 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_204352__182 | 0 | 0.00559412 | 13.0054 | 0 | [370, 568] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_204352__182.json | 25.0 | missing | missing | missing | |
| 9684 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_204403__131 | 0 | 0.00491456 | 11.0884 | 0 | [370, 484] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_204403__131.json | 25.0 | missing | missing | missing | |
| 9685 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_201015__541 | 0 | 0.0 | 30.0805 | 0 | [370, 605] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_201015__541.json | 50.0 | 0.9 | missing | 0.3 | |
| 9686 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_204654__422 | 0 | 0.00791594 | 38.0856 | 0 | [367, 856] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_204654__422.json | 25.0 | missing | missing | missing | |
| 9687 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_225514__862 | 0 | 0.00509253 | 25.4404 | 0 | [367, 507] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_225514__862.json | 0.0 | missing | missing | missing | |
| 9688 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_225543__324 | 0 | 0.00603097 | 28.4997 | 0 | [367, 623] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_225543__324.json | 25.0 | missing | missing | missing | |
| 9689 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_204327__117 | 0 | 0.00500354 | 11.3939 | 0 | [367, 496] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_204327__117.json | 25.0 | missing | missing | missing | |
| 9690 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_204339__473 | 0 | 0.00489837 | 11.0877 | 0 | [367, 483] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_204339__473.json | 25.0 | missing | missing | missing | |
| 9691 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_200944__870 | 3 | 0.0 | 19.9336 | 4 | [367, 421] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_200944__870.json | 90.0 | 0.9 | missing | 0.3 | |
| 9692 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231213_204454__282 | 0 | 0.000776021 | 6.86213 | 0 | [63, 379] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__AsIs__1SHOT__20231213_204454__282.json | 0.0 | missing | missing | missing | |
| 9693 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_225233__603 | 0 | 0.00116402 | 12.2625 | 0 | [63, 579] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__AsIs__1SHOT__20231225_225233__603.json | 0.0 | missing | missing | missing | |
| 9694 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_225239__155 | 0 | 0.000921521 | 6.15211 | 0 | [63, 454] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__AsIs__1SHOT__20231225_225239__155.json | 0.0 | missing | missing | missing | |
| 9695 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small--optim | AsIs | 1SHOT | false | false | 5 | 20231215_200824__631 | 0 | 0.0 | 6.6435 | 0 | [63, 509] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__AsIs__1SHOT__20231215_200824__631.json | 0.0 | 0.9 | missing | 0.3 | |
| 9696 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | InJulia | 1SHOT | true | false | 5 | 20231213_204447__358 | 0 | 0.00115432 | 7.64429 | 0 | [66, 573] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__InJulia__1SHOT__20231213_204447__358.json | 25.0 | missing | missing | missing | |
| 9697 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | InJulia | 1SHOT | true | false | 5 | 20231225_225214__802 | 0 | 0.000828402 | 5.57696 | 0 | [66, 405] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__InJulia__1SHOT__20231225_225214__802.json | 25.0 | missing | missing | missing | |
| 9698 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | InJulia | 1SHOT | true | false | 5 | 20231225_225220__917 | 0 | 0.000826462 | 5.68446 | 0 | [66, 404] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__InJulia__1SHOT__20231225_225220__917.json | 25.0 | missing | missing | missing | |
| 9699 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | InJulia | 1SHOT | true | false | 5 | 20231227_203514__947 | 0 | 0.000929282 | 6.21421 | 0 | [66, 457] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__InJulia__1SHOT__20231227_203514__947.json | 25.0 | missing | missing | missing | |
| 9700 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | InJulia | 1SHOT | true | false | 5 | 20231227_203520__984 | 0 | 0.000834222 | 5.45653 | 0 | [66, 408] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__InJulia__1SHOT__20231227_203520__984.json | 25.0 | missing | missing | missing | |
| 9701 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small--optim | InJulia | 1SHOT | true | false | 5 | 20231215_200817__183 | 0 | 0.0 | 5.3793 | 0 | [66, 404] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__InJulia__1SHOT__20231215_200817__183.json | 25.0 | 0.9 | missing | 0.3 | |
| 9702 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_204439__687 | 0 | 0.000631829 | 3.96852 | 0 | [107, 290] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_204439__687.json | 25.0 | missing | missing | missing | |
| 9703 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_225206__675 | 0 | 0.000608549 | 3.78935 | 0 | [107, 278] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_225206__675.json | 25.0 | missing | missing | missing | |
| 9704 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_225209__910 | 0 | 0.000449469 | 2.75704 | 0 | [107, 196] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_225209__910.json | 25.0 | missing | missing | missing | |
| 9705 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_203502__638 | 0 | 0.000773449 | 4.90322 | 0 | [107, 363] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_203502__638.json | 25.0 | missing | missing | missing | |
| 9706 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_203507__469 | 0 | 0.000757929 | 4.81955 | 0 | [107, 355] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_203507__469.json | 25.0 | missing | missing | missing | |
| 9707 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231215_200812__329 | 0 | 0.0 | 5.70287 | 0 | [107, 426] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_200812__329.json | 25.0 | 0.9 | missing | 0.3 | |
| 9708 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_204434__390 | 0 | 0.00103603 | 6.31987 | 0 | [195, 469] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_204434__390.json | 25.0 | missing | missing | missing | |
| 9709 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_225155__924 | 0 | 0.00121256 | 8.97073 | 0 | [195, 560] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225155__924.json | 50.0 | missing | missing | missing | |
| 9710 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_225202__645 | 0 | 0.000968125 | 6.41057 | 0 | [195, 434] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225202__645.json | 25.0 | missing | missing | missing | |
| 9711 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_203450__528 | 0 | 0.000364785 | 1.92674 | 0 | [195, 123] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_203450__528.json | 0.0 | missing | missing | missing | |
| 9712 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_203457__188 | 0 | 0.00123972 | 7.66228 | 0 | [195, 574] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_203457__188.json | 25.0 | missing | missing | missing | |
| 9713 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_200806__205 | 0 | 0.0 | 6.03443 | 0 | [195, 450] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_200806__205.json | 25.0 | 0.9 | missing | 0.3 | |
| 9714 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_204508__259 | 1 | 0.00131609 | 7.53538 | 1 | [373, 554] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_204508__259.json | 61.25 | missing | missing | missing | |
| 9715 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_225305__799 | 0 | 0.00163037 | 9.80362 | 0 | [373, 716] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225305__799.json | 0.0 | missing | missing | missing | |
| 9716 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_225313__757 | 0 | 0.00144025 | 8.53909 | 0 | [373, 618] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225313__757.json | 25.0 | missing | missing | missing | |
| 9717 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_203545__668 | 1 | 0.00117835 | 6.88249 | 1 | [373, 483] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_203545__668.json | 61.25 | missing | missing | missing | |
| 9718 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_203553__470 | 0 | 0.00135489 | 7.77747 | 0 | [373, 574] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_203553__470.json | 25.0 | missing | missing | missing | |
| 9719 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231215_200839__593 | 0 | 0.0 | 7.42582 | 0 | [373, 517] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_200839__593.json | 25.0 | 0.9 | missing | 0.3 | |
| 9720 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_204501__771 | 0 | 0.00125078 | 7.01824 | 0 | [371, 521] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_204501__771.json | 0.0 | missing | missing | missing | |
| 9721 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_225247__106 | 0 | 0.0013827 | 8.06475 | 0 | [371, 589] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_225247__106.json | 25.0 | missing | missing | missing | |
| 9722 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_225254__469 | 0 | 0.00124108 | 7.05822 | 0 | [371, 516] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_225254__469.json | 25.0 | missing | missing | missing | |
| 9723 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_203527__726 | 1 | 0.0012857 | 7.25134 | 1 | [371, 539] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_203527__726.json | 61.25 | missing | missing | missing | |
| 9724 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_203537__854 | 1 | 0.00148358 | 10.1715 | 1 | [371, 641] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_203537__854.json | 61.25 | missing | missing | missing | |
| 9725 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-small--optim | JuliaRecapTask | 1SHOT | true | false | 5 | 20231215_200831__272 | 0 | 0.0 | 6.83546 | 0 | [371, 504] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_200831__272.json | 25.0 | 0.9 | missing | 0.3 | |
| 9726 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_204404__593 | 0 | 0.000277902 | 10.1506 | 0 | [63, 594] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__AsIs__1SHOT__20231213_204404__593.json | 0.0 | missing | missing | missing | |
| 9727 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_225119__130 | 0 | 0.00032592 | 5.94126 | 0 | [63, 700] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__AsIs__1SHOT__20231225_225119__130.json | 0.0 | missing | missing | missing | |
| 9728 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_225124__808 | 0 | 0.000277449 | 5.02088 | 0 | [63, 593] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__AsIs__1SHOT__20231225_225124__808.json | 0.0 | missing | missing | missing | |
| 9729 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny--optim | AsIs | 1SHOT | false | false | 5 | 20231215_200748__819 | 0 | 0.0 | 6.85154 | 0 | [63, 835] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__AsIs__1SHOT__20231215_200748__819.json | 0.0 | 0.9 | missing | 0.3 | |
| 9730 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | InJulia | 1SHOT | true | false | 5 | 20231213_204354__297 | 0 | 0.000209466 | 5.89253 | 0 | [66, 442] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__InJulia__1SHOT__20231213_204354__297.json | 25.0 | missing | missing | missing | |
| 9731 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_225106__739 | 0 | 0.000208107 | 3.83036 | 0 | [66, 439] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__InJulia__1SHOT__20231225_225106__739.json | 50.0 | missing | missing | missing | |
| 9732 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_225113__469 | 0 | 0.000348537 | 6.27024 | 0 | [66, 749] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__InJulia__1SHOT__20231225_225113__469.json | 50.0 | missing | missing | missing | |
| 9733 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | InJulia | 1SHOT | true | false | 5 | 20231227_203416__479 | 0 | 0.000239817 | 4.55596 | 0 | [66, 509] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__InJulia__1SHOT__20231227_203416__479.json | 25.0 | missing | missing | missing | |
| 9734 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_203422__930 | 1 | 0.000310938 | 5.96222 | 1 | [66, 666] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__InJulia__1SHOT__20231227_203422__930.json | 61.25 | missing | missing | missing | |
| 9735 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny--optim | InJulia | 1SHOT | true | false | 5 | 20231215_200741__194 | 0 | 0.0 | 3.17931 | 0 | [66, 381] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__InJulia__1SHOT__20231215_200741__194.json | 25.0 | 0.9 | missing | 0.3 | |
| 9736 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_204348__903 | 0 | 0.000240574 | 8.193 | 0 | [107, 498] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_204348__903.json | 25.0 | missing | missing | missing | |
| 9737 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_225100__162 | 0 | 0.000193009 | 3.409 | 0 | [107, 393] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_225100__162.json | 0.0 | missing | missing | missing | |
| 9738 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_225103__370 | 0 | 0.000153145 | 2.67991 | 0 | [107, 305] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_225103__370.json | 25.0 | missing | missing | missing | |
| 9739 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_203407__229 | 0 | 0.000179419 | 3.31114 | 0 | [107, 363] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_203407__229.json | 0.0 | missing | missing | missing | |
| 9740 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_203411__894 | 0 | 0.000224266 | 4.1279 | 0 | [107, 462] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_203411__894.json | 25.0 | missing | missing | missing | |
| 9741 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231215_200737__460 | 0 | 0.0 | 4.23313 | 0 | [107, 504] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_200737__460.json | 25.0 | 0.9 | missing | 0.3 | |
| 9742 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_204340__231 | 0 | 0.000228432 | 6.05772 | 0 | [195, 444] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_204340__231.json | 25.0 | missing | missing | missing | |
| 9743 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_225053__324 | 0 | 0.00023568 | 8.71154 | 0 | [195, 460] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225053__324.json | 25.0 | missing | missing | missing | |
| 9744 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_225056__797 | 0 | 0.000210765 | 3.57113 | 0 | [195, 405] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225056__797.json | 0.0 | missing | missing | missing | |
| 9745 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_203359__297 | 0 | 0.000262407 | 10.5447 | 0 | [195, 519] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_203359__297.json | 50.0 | missing | missing | missing | |
| 9746 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_203404__763 | 0 | 0.000266937 | 4.76094 | 0 | [195, 529] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_203404__763.json | 0.0 | missing | missing | missing | |
| 9747 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_200733__867 | 0 | 0.0 | 6.0138 | 0 | [195, 478] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_200733__867.json | 25.0 | 0.9 | missing | 0.3 | |
| 9748 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_204428__576 | 0 | 0.000337157 | 10.7631 | 0 | [373, 629] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_204428__576.json | 50.0 | missing | missing | missing | |
| 9749 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_225140__845 | 0 | 0.000307259 | 4.87407 | 0 | [373, 563] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225140__845.json | 50.0 | missing | missing | missing | |
| 9750 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_225146__823 | 0 | 0.000361166 | 5.90805 | 0 | [373, 682] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225146__823.json | 50.0 | missing | missing | missing | |
| 9751 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_203440__318 | 0 | 0.000293669 | 4.91955 | 0 | [373, 533] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_203440__318.json | 50.0 | missing | missing | missing | |
| 9752 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_203447__186 | 1 | 0.00042368 | 7.59177 | 1 | [373, 820] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_203447__186.json | 61.25 | missing | missing | missing | |
| 9753 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231215_200800__700 | 0 | 0.0 | 5.15425 | 0 | [373, 609] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_200800__700.json | 25.0 | 0.9 | missing | 0.3 | |
| 9754 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_204417__762 | 0 | 0.000358168 | 12.6924 | 0 | [371, 676] | 0.10.0-DEV | 4 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_204417__762.json | 25.0 | missing | missing | missing | |
| 9755 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_225129__658 | 1 | 0.000330535 | 5.35313 | 4 | [371, 615] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_225129__658.json | 80.0 | missing | missing | missing | |
| 9756 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_225135__210 | 0 | 0.000330988 | 5.37487 | 0 | [371, 616] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_225135__210.json | 25.0 | missing | missing | missing | |
| 9757 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_203428__855 | 0 | 0.000334612 | 5.80577 | 0 | [371, 624] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_203428__855.json | 25.0 | missing | missing | missing | |
| 9758 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_203435__947 | 0 | 0.000367681 | 6.49938 | 0 | [371, 697] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_203435__947.json | 25.0 | missing | missing | missing | |
| 9759 | Apple-MacBook-Pro-M1 | pig_latinify | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_200754__476 | 0 | 0.0 | 6.6431 | 0 | [371, 783] | 0.10.0-DEV | 4 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_200754__476.json | 50.0 | 0.9 | missing | 0.3 | |
| 9760 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_135605__732 | 0 | 0.0 | 13.7496 | 0 | [62, 350] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_135605__732.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9761 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_135622__610 | 0 | 0.0 | 16.8249 | 0 | [62, 429] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_135622__610.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9762 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_135534__374 | 0 | 0.0 | 18.1549 | 0 | [65, 460] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_135534__374.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9763 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_135551__316 | 0 | 0.0 | 16.874 | 1 | [65, 427] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_135551__316.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9764 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_021310__573 | 0 | 0.0 | 15.078 | 0 | [65, 379] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_021310__573.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9765 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_135509__911 | 0 | 0.0 | 14.718 | 0 | [106, 366] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135509__911.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9766 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_135516__616 | 0 | 0.0 | 6.83943 | 0 | [106, 162] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135516__616.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9767 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_021255__329 | 0 | 0.0 | 13.2668 | 0 | [106, 327] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_021255__329.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9768 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_135439__515 | 0 | 0.0 | 25.7898 | 0 | [194, 488] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_135439__515.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9769 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_135454__679 | 0 | 0.0 | 14.9856 | 0 | [194, 357] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_135454__679.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9770 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_021242__989 | 1 | 0.0 | 19.6122 | 1 | [194, 337] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021242__989.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9771 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_135825__870 | 0 | 0.0 | 83.5342 | 0 | [373, 1876] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_135825__870.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9772 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_135849__294 | 0 | 0.0 | 24.1658 | 0 | [373, 555] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_135849__294.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9773 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_021349__190 | 0 | 0.0 | 17.856 | 0 | [373, 398] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021349__190.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9774 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_135641__286 | 0 | 0.0 | 19.4596 | 0 | [371, 440] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_135641__286.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9775 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_135701__646 | 0 | 0.0 | 19.9801 | 0 | [371, 453] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_135701__646.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9776 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_021331__324 | 0 | 0.0 | 20.7726 | 0 | [371, 469] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_021331__324.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9777 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_001149__803 | 0 | 0.0 | 19.7486 | 0 | [64, 628] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001149__803.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9778 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231228_001205__447 | 0 | 0.0 | 15.501 | 0 | [64, 496] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001205__447.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9779 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231228_001219__967 | 0 | 0.0 | 14.2462 | 0 | [64, 456] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001219__967.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9780 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_001238__607 | 0 | 0.0 | 19.2741 | 0 | [64, 614] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001238__607.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9781 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231228_001255__370 | 0 | 0.0 | 16.5398 | 0 | [64, 529] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001255__370.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |
| 9850 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_124323__321 | 0 | 0.0 | 103.764 | 0 | [108, 610] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_124323__321.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9851 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_124343__710 | 0 | 0.0 | 19.945 | 0 | [108, 105] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_124343__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9852 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_154212__810 | 0 | 0.0 | 57.0933 | 0 | [108, 330] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_154212__810.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9853 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_154337__465 | 0 | 0.0 | 85.6078 | 0 | [108, 500] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_154337__465.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9854 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_123845__735 | 0 | 0.0 | 86.748 | 0 | [197, 461] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123845__735.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9855 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_123943__701 | 1 | 0.0 | 56.3193 | 1 | [197, 311] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123943__701.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9856 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_124037__842 | 0 | 0.0 | 53.6694 | 0 | [197, 295] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_124037__842.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9857 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_153945__796 | 1 | 0.0 | 81.1545 | 1 | [197, 456] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_153945__796.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9858 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_154114__227 | 0 | 0.0 | 88.179 | 0 | [197, 497] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_154114__227.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9859 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_125408__285 | 0 | 0.0 | 67.0015 | 0 | [382, 324] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125408__285.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9860 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_125419__358 | 0 | 0.0 | 10.676 | 0 | [382, 5] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125419__358.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9861 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_125614__566 | 0 | 0.0 | 114.857 | 2 | [382, 587] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125614__566.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9862 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_154818__634 | 0 | 0.0 | 10.5808 | 0 | [382, 5] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_154818__634.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9863 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_125118__280 | 0 | 0.0 | 106.0 | 0 | [380, 566] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_125118__280.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9864 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_125301__203 | 0 | 0.0 | 102.401 | 0 | [380, 522] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_125301__203.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9865 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_154703__260 | 0 | 0.0 | 60.8068 | 1 | [380, 304] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_154703__260.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9866 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_154807__995 | 0 | 0.0 | 63.5109 | 0 | [380, 320] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_154807__995.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9867 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_140101__535 | 0 | 0.0 | 19.2421 | 0 | [70, 486] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_140101__535.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9868 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_140114__654 | 0 | 0.0 | 12.9118 | 0 | [70, 325] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_140114__654.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9869 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_140027__751 | 0 | 0.0 | 14.7899 | 0 | [73, 373] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_140027__751.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9870 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_140041__555 | 0 | 0.0 | 13.6465 | 0 | [73, 343] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_140041__555.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9871 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_021444__549 | 0 | 0.0 | 16.425 | 0 | [73, 412] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_021444__549.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9872 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_135953__691 | 0 | 0.0 | 22.0185 | 0 | [114, 549] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135953__691.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9873 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_140012__160 | 1 | 0.0 | 19.0662 | 2 | [114, 475] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_140012__160.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9874 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_021427__571 | 0 | 0.0 | 11.942 | 0 | [114, 292] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_021427__571.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9875 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_135913__290 | 0 | 0.0 | 24.036 | 0 | [202, 424] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_135913__290.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9876 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_135931__848 | 0 | 0.0 | 17.8621 | 0 | [202, 428] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_135931__848.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9877 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_021415__573 | 0 | 0.0 | 25.2061 | 0 | [202, 461] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021415__573.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9878 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_140214__724 | 0 | 0.0 | 16.7107 | 0 | [381, 371] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140214__724.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9879 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_140239__175 | 0 | 0.0 | 24.9271 | 0 | [381, 571] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140239__175.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9880 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_021530__141 | 0 | 0.0 | 22.7108 | 0 | [381, 515] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021530__141.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9881 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_140137__270 | 0 | 0.0 | 23.8317 | 0 | [379, 545] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_140137__270.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9882 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_140157__205 | 0 | 0.0 | 20.0822 | 0 | [379, 454] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_140157__205.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9883 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_021507__783 | 0 | 0.0 | 23.0778 | 0 | [379, 524] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_021507__783.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9884 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231214_080723__822 | 0 | 0.0 | 17.6078 | 0 | [53, 523] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_080723__822.json | 0.0 | missing | missing | missing | |
| 9885 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_122850__518 | 0 | 0.0 | 11.0492 | 0 | [68, 358] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_122850__518.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9886 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_122902__942 | 0 | 0.0 | 11.6989 | 0 | [68, 379] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_122902__942.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9887 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231214_080705__434 | 0 | 0.0 | 17.7645 | 0 | [70, 523] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_080705__434.json | 0.0 | missing | missing | missing | |
| 9888 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231225_122830__431 | 0 | 0.0 | 9.95201 | 0 | [71, 322] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_122830__431.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9889 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231225_122839__519 | 0 | 0.0 | 9.12528 | 0 | [71, 294] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_122839__519.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9890 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231227_013921__812 | 0 | 0.0 | 14.5149 | 0 | [71, 468] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_013921__812.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9891 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_080648__302 | 0 | 0.0 | 13.7355 | 0 | [99, 398] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_080648__302.json | 25.0 | missing | missing | missing | |
| 9892 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_122810__422 | 0 | 0.0 | 8.84315 | 2 | [112, 279] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_122810__422.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9893 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_122820__161 | 0 | 0.0 | 9.99183 | 0 | [112, 316] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_122820__161.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9894 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013906__634 | 1 | 0.0 | 10.1038 | 2 | [112, 318] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_013906__634.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9895 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_080634__525 | 0 | 0.0 | 24.5301 | 0 | [187, 659] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080634__525.json | 0.0 | missing | missing | missing | |
| 9896 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_122749__683 | 0 | 0.0 | 20.469 | 0 | [200, 470] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122749__683.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9897 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_122801__716 | 0 | 0.0 | 11.3589 | 0 | [200, 344] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122801__716.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9898 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_013856__307 | 0 | 0.0 | 16.051 | 0 | [200, 334] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013856__307.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9899 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_080820__640 | 0 | 0.0 | 29.216 | 0 | [11, 767] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080820__640.json | 0.0 | missing | missing | missing | |
| 9900 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_123004__554 | 0 | 0.0 | 18.695 | 0 | [379, 540] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_123004__554.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9901 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_123021__579 | 0 | 0.0 | 16.4639 | 0 | [379, 472] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_123021__579.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9902 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013953__802 | 0 | 0.0 | 11.5938 | 0 | [379, 318] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013953__802.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9903 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_080750__534 | 0 | 0.0 | 27.2855 | 0 | [370, 646] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_080750__534.json | 0.0 | missing | missing | missing | |
| 9904 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_122920__425 | 0 | 0.0 | 18.2473 | 0 | [377, 527] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_122920__425.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9905 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_122946__954 | 0 | 0.0 | 25.3982 | 0 | [377, 742] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_122946__954.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9906 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013941__393 | 0 | 0.0 | 20.5509 | 0 | [377, 593] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_013941__393.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9907 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231214_082108__980 | 0 | 0.0 | 12.4404 | 0 | [53, 376] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__AsIs__1SHOT__20231214_082108__980.json | 0.0 | missing | missing | missing | |
| 9908 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_132035__255 | 0 | 0.0 | 13.3919 | 0 | [70, 241] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__AsIs__1SHOT__20231225_132035__255.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9909 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_132038__622 | 0 | 0.0 | 3.22546 | 0 | [70, 47] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__AsIs__1SHOT__20231225_132038__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9910 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231214_082056__322 | 0 | 0.0 | 13.8015 | 0 | [70, 410] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__InJulia__1SHOT__20231214_082056__322.json | 25.0 | missing | missing | missing | |
| 9911 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_132003__532 | 0 | 0.0 | 6.43752 | 0 | [73, 109] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__InJulia__1SHOT__20231225_132003__532.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9912 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_132021__663 | 0 | 0.0 | 18.6527 | 0 | [73, 340] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__InJulia__1SHOT__20231225_132021__663.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9913 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231227_015854__343 | 0 | 0.0 | 3.75015 | 0 | [73, 57] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__InJulia__1SHOT__20231227_015854__343.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9914 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_082042__155 | 0 | 0.0 | 15.77 | 0 | [99, 454] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_082042__155.json | 50.0 | missing | missing | missing | |
| 9915 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_131940__532 | 0 | 0.0 | 16.3154 | 0 | [112, 291] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_131940__532.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9916 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_131956__508 | 0 | 0.0 | 16.203 | 0 | [112, 289] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_131956__508.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9917 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_015850__363 | 0 | 0.0 | 2.84848 | 0 | [112, 35] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_015850__363.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9918 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_082026__383 | 0 | 0.0 | 23.0893 | 0 | [187, 622] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_082026__383.json | 0.0 | missing | missing | missing | |
| 9919 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_131856__753 | 0 | 0.0 | 47.7254 | 0 | [200, 666] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_131856__753.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9920 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_131924__165 | 0 | 0.0 | 27.2942 | 0 | [200, 472] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_131924__165.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9921 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_015847__160 | 0 | 0.0 | 28.1188 | 0 | [200, 324] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_015847__160.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9922 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_082230__141 | 0 | 0.0 | 39.8219 | 0 | [11, 1011] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082230__141.json | 25.0 | missing | missing | missing | |
| 9923 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_132156__787 | 0 | 0.0 | 15.9308 | 0 | [376, 236] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_132156__787.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9924 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_132238__753 | 0 | 0.0 | 42.3446 | 0 | [376, 693] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_132238__753.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9925 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_015953__599 | 0 | 0.0 | 5.01372 | 0 | [376, 36] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_015953__599.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9926 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_082150__751 | 0 | 0.0 | 41.691 | 0 | [370, 984] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_082150__751.json | 25.0 | missing | missing | missing | |
| 9927 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_132054__879 | 0 | 0.0 | 15.7177 | 0 | [373, 232] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_132054__879.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9928 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_132140__359 | 0 | 0.0 | 45.8638 | 0 | [373, 752] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_132140__359.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9929 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_015948__418 | 0 | 0.0 | 53.7484 | 0 | [373, 877] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_015948__418.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9930 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_140450__570 | 0 | 0.0 | 9.88114 | 0 | [60, 387] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_140450__570.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9931 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_140502__904 | 0 | 0.0 | 12.3523 | 0 | [60, 480] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_140502__904.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9932 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_140421__897 | 0 | 0.0 | 31.824 | 0 | [63, 1160] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_140421__897.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9933 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_140440__944 | 0 | 0.0 | 18.955 | 0 | [63, 722] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_140440__944.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9934 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_021622__312 | 0 | 0.0 | 36.9302 | 0 | [63, 1314] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_021622__312.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9935 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_140318__388 | 0 | 0.0 | 11.0023 | 0 | [100, 420] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_140318__388.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9936 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_140349__476 | 0 | 0.0 | 30.9699 | 0 | [100, 1119] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_140349__476.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9937 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_021545__722 | 0 | 0.0 | 10.3744 | 0 | [100, 394] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_021545__722.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9938 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_140305__636 | 0 | 0.0 | 25.9261 | 0 | [187, 812] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140305__636.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9939 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_140307__982 | 0 | 0.0 | 1.66878 | 0 | [187, 45] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140307__982.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9940 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_021535__171 | 0 | 0.0 | 4.90598 | 0 | [187, 36] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021535__171.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9941 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_140607__579 | 0 | 0.0 | 20.9012 | 0 | [352, 715] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140607__579.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9942 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_140610__883 | 0 | 0.0 | 3.17258 | 0 | [352, 80] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140610__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9943 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_021648__301 | 0 | 0.0 | 12.7265 | 0 | [352, 430] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021648__301.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9944 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_140531__619 | 0 | 0.0 | 28.9073 | 0 | [349, 978] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_140531__619.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9945 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_140546__576 | 0 | 0.0 | 15.3841 | 0 | [349, 527] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_140546__576.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9946 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_021635__476 | 0 | 0.0 | 13.0256 | 0 | [349, 441] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_021635__476.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9947 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231214_082336__533 | 0 | 0.0 | 18.4904 | 0 | [53, 547] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_082336__533.json | 0.0 | missing | missing | missing | |
| 9948 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_132847__103 | 0 | 0.0 | 45.478 | 0 | [78, 355] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_132847__103.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9949 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_132932__757 | 0 | 0.0 | 45.4516 | 0 | [78, 355] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_132932__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9950 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231214_082317__458 | 0 | 0.0 | 13.1889 | 0 | [70, 392] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_082317__458.json | 25.0 | missing | missing | missing | |
| 9951 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231225_132708__917 | 0 | 0.0 | 62.7314 | 0 | [81, 492] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_132708__917.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9952 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_132801__488 | 1 | 0.0 | 52.7308 | 4 | [81, 413] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_132801__488.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9953 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231227_020250__865 | 0 | 0.0 | 45.0245 | 0 | [81, 345] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_020250__865.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9954 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_082304__370 | 0 | 0.0 | 14.6784 | 0 | [99, 423] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_082304__370.json | 0.0 | missing | missing | missing | |
| 9955 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_132510__441 | 2 | 0.0 | 43.5223 | 2 | [120, 333] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_132510__441.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9956 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_132605__301 | 0 | 0.0 | 54.7192 | 0 | [120, 422] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_132605__301.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9957 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_020204__927 | 3 | 0.0 | 54.7475 | 4 | [120, 417] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_020204__927.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9958 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_082249__196 | 0 | 0.0 | 19.2682 | 0 | [187, 518] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_082249__196.json | 25.0 | missing | missing | missing | |
| 9959 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_132329__447 | 2 | 0.0 | 50.7477 | 2 | [208, 197] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_132329__447.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9960 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_132426__391 | 2 | 0.0 | 57.2676 | 2 | [208, 424] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_132426__391.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9961 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_020109__494 | 0 | 0.0 | 76.3443 | 0 | [208, 409] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_020109__494.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9962 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_082410__421 | 0 | 0.0 | 9.46659 | 0 | [11, 264] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082410__421.json | 0.0 | missing | missing | missing | |
| 9963 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_133247__654 | 2 | 0.0 | 53.572 | 2 | [384, 361] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_133247__654.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9964 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_133344__452 | 0 | 0.0 | 57.5674 | 0 | [384, 392] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_133344__452.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9965 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_020448__436 | 2 | 0.0 | 57.8705 | 2 | [384, 391] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_020448__436.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9966 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_082400__618 | 0 | 0.0 | 24.6635 | 4 | [370, 581] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_082400__618.json | 75.0 | missing | missing | missing | |
| 9967 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_133034__780 | 0 | 0.0 | 61.7081 | 0 | [381, 424] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_133034__780.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9968 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_133153__836 | 1 | 0.0 | 78.833 | 1 | [381, 555] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_133153__836.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9969 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_020350__474 | 1 | 0.0 | 59.9814 | 1 | [381, 408] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_020350__474.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9970 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_135157__657 | 0 | 0.0 | 32.5703 | 0 | [70, 554] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_135157__657.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9971 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_135221__648 | 0 | 0.0 | 23.3516 | 0 | [70, 397] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_135221__648.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9972 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_135056__605 | 0 | 0.0 | 24.4731 | 0 | [73, 416] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_135056__605.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9973 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_135124__417 | 1 | 0.0 | 27.5513 | 1 | [73, 469] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_135124__417.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 9974 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_021130__283 | 0 | 0.0 | 28.0632 | 0 | [73, 476] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_021130__283.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9975 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_135002__442 | 0 | 0.0 | 13.8568 | 0 | [114, 227] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135002__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9976 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_135032__600 | 0 | 0.0 | 29.479 | 0 | [114, 495] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135032__600.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9977 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_021102__810 | 0 | 0.0 | 29.5296 | 0 | [114, 494] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_021102__810.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9978 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_134926__616 | 0 | 0.0 | 36.9362 | 0 | [202, 453] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_134926__616.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9979 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_134949__995 | 0 | 0.0 | 22.4764 | 0 | [202, 360] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_134949__995.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9980 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_021033__639 | 0 | 0.0 | 27.3627 | 0 | [202, 293] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021033__639.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9981 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_135331__162 | 0 | 0.0 | 24.9466 | 0 | [381, 373] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_135331__162.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9982 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_135413__350 | 0 | 0.0 | 41.3785 | 0 | [381, 641] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_135413__350.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9983 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_021222__286 | 0 | 0.0 | 22.84 | 0 | [381, 337] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021222__286.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9984 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_135244__159 | 0 | 0.0 | 23.9556 | 0 | [379, 357] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_135244__159.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9985 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_135306__568 | 0 | 0.0 | 21.7273 | 0 | [379, 320] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_135306__568.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9986 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_021159__626 | 0 | 0.0 | 28.7177 | 0 | [379, 434] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_021159__626.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9987 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231214_081912__858 | 0 | 0.0 | 13.8093 | 0 | [53, 415] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__AsIs__1SHOT__20231214_081912__858.json | 0.0 | missing | missing | missing | |
| 9988 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_131701__788 | 0 | 0.0 | 7.28656 | 0 | [72, 415] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_131701__788.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9989 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_131717__923 | 0 | 0.0 | 16.4277 | 0 | [72, 883] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_131717__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9990 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231214_081858__617 | 0 | 0.0 | 14.8352 | 0 | [70, 441] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_081858__617.json | 25.0 | missing | missing | missing | |
| 9991 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_131644__856 | 0 | 0.0 | 7.91779 | 0 | [75, 450] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_131644__856.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9992 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_131654__766 | 0 | 0.0 | 9.14437 | 0 | [75, 516] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_131654__766.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9993 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231227_015800__398 | 0 | 0.0 | 10.3343 | 0 | [75, 573] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_015800__398.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9994 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_081843__458 | 0 | 0.0 | 12.5548 | 0 | [99, 362] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_081843__458.json | 0.0 | missing | missing | missing | |
| 9995 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_131627__100 | 0 | 0.0 | 8.09751 | 0 | [112, 451] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_131627__100.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9996 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_131637__198 | 0 | 0.0 | 9.43657 | 0 | [112, 523] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_131637__198.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9997 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_015750__702 | 0 | 0.0 | 10.4733 | 0 | [112, 570] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_015750__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 9998 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_081830__586 | 0 | 0.0 | 24.7267 | 0 | [187, 663] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081830__586.json | 50.0 | missing | missing | missing | |
| 9999 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_131612__431 | 0 | 0.0 | 18.1742 | 0 | [196, 788] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_131612__431.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10000 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_131619__542 | 0 | 0.0 | 7.37589 | 0 | [196, 389] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_131619__542.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10001 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_015739__327 | 0 | 0.0 | 12.5823 | 0 | [196, 515] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_015739__327.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10002 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_082003__715 | 0 | 0.0 | 25.9145 | 0 | [11, 688] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082003__715.json | 50.0 | missing | missing | missing | |
| 10003 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_131751__851 | 0 | 0.0 | 13.9418 | 0 | [362, 666] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_131751__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10004 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_131809__403 | 0 | 0.0 | 17.8849 | 0 | [362, 847] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_131809__403.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10005 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_015819__825 | 0 | 0.0 | 12.0121 | 0 | [362, 568] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_015819__825.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10006 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_081937__240 | 0 | 0.0 | 25.5711 | 0 | [370, 603] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_081937__240.json | 0.0 | missing | missing | missing | |
| 10007 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_131728__333 | 0 | 0.0 | 10.3448 | 0 | [360, 492] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_131728__333.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10008 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_131737__616 | 0 | 0.0 | 8.99895 | 0 | [360, 427] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_131737__616.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10009 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_015807__390 | 0 | 0.0 | 7.26255 | 0 | [360, 334] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_015807__390.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10010 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231214_080927__889 | 0 | 0.0 | 23.3717 | 0 | [53, 681] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__AsIs__1SHOT__20231214_080927__889.json | 0.0 | missing | missing | missing | |
| 10011 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_123153__637 | 0 | 0.0 | 11.2183 | 0 | [70, 362] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_123153__637.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10012 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_123202__913 | 0 | 0.0 | 8.6586 | 0 | [70, 278] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_123202__913.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10013 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231214_080904__504 | 0 | 0.0 | 12.1193 | 0 | [70, 361] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_080904__504.json | 25.0 | missing | missing | missing | |
| 10014 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231225_123132__737 | 0 | 0.0 | 15.6397 | 0 | [73, 507] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_123132__737.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10015 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231225_123142__924 | 0 | 0.0 | 9.73797 | 0 | [73, 314] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_123142__924.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10016 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_014042__401 | 0 | 0.0 | 12.1524 | 0 | [73, 391] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_014042__401.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10017 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_080852__748 | 0 | 0.0 | 10.807 | 0 | [99, 312] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_080852__748.json | 0.0 | missing | missing | missing | |
| 10018 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_123104__817 | 0 | 0.0 | 14.284 | 0 | [114, 455] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_123104__817.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10019 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_123116__383 | 0 | 0.0 | 12.4757 | 0 | [114, 397] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_123116__383.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10020 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_014029__246 | 0 | 0.0 | 17.399 | 0 | [114, 551] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_014029__246.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10021 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_080841__522 | 0 | 0.0 | 21.3463 | 0 | [187, 576] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080841__522.json | 0.0 | missing | missing | missing | |
| 10022 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_123037__156 | 0 | 0.0 | 16.3647 | 0 | [202, 333] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_123037__156.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10023 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_123049__829 | 0 | 0.0 | 11.8429 | 0 | [202, 357] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_123049__829.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10024 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_014012__897 | 1 | 0.0 | 18.8181 | 1 | [202, 415] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_014012__897.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10025 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_081006__474 | 0 | 0.0 | 15.4694 | 0 | [11, 425] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081006__474.json | 0.0 | missing | missing | missing | |
| 10026 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_123241__948 | 1 | 0.0 | 11.1697 | 1 | [381, 306] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_123241__948.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10027 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_123253__237 | 0 | 0.0 | 12.2955 | 0 | [381, 342] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_123253__237.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10028 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_014116__936 | 0 | 0.0 | 22.062 | 1 | [381, 637] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_014116__936.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10029 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_080950__590 | 0 | 0.0 | 22.8869 | 0 | [370, 536] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_080950__590.json | 50.0 | missing | missing | missing | |
| 10030 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_123215__461 | 0 | 0.0 | 12.857 | 1 | [379, 359] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_123215__461.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10031 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_123229__164 | 1 | 0.0 | 14.1804 | 1 | [379, 401] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_123229__164.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10032 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_014054__288 | 0 | 0.0 | 12.641 | 0 | [379, 350] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_014054__288.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10033 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231214_081115__170 | 0 | 0.0 | 20.3033 | 0 | [53, 597] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__AsIs__1SHOT__20231214_081115__170.json | 0.0 | missing | missing | missing | |
| 10034 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231225_124251__233 | 0 | 0.0 | 107.718 | 0 | [66, 809] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_124251__233.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10035 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231225_124417__140 | 0 | 0.0 | 85.7066 | 0 | [66, 646] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_124417__140.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10036 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231214_081055__763 | 0 | 0.0 | 17.8315 | 0 | [70, 524] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_081055__763.json | 25.0 | missing | missing | missing | |
| 10037 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_123848__320 | 0 | 0.0 | 109.408 | 0 | [69, 822] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_123848__320.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10038 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_124103__605 | 1 | 0.0 | 134.581 | 1 | [69, 1002] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_124103__605.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10039 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231227_014510__408 | 0 | 0.0 | 70.8413 | 0 | [69, 533] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_014510__408.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10040 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_081037__415 | 0 | 0.0 | 13.6236 | 0 | [99, 394] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_081037__415.json | 25.0 | missing | missing | missing | |
| 10041 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_123608__455 | 0 | 0.0 | 33.2181 | 0 | [108, 242] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_123608__455.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10042 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_123659__813 | 0 | 0.0 | 50.1418 | 0 | [108, 372] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_123659__813.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10043 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_014359__337 | 1 | 0.0 | 62.8498 | 1 | [108, 467] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_014359__337.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10044 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_081023__306 | 0 | 0.0 | 17.1399 | 0 | [187, 463] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081023__306.json | 25.0 | missing | missing | missing | |
| 10045 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_123442__694 | 0 | 0.0 | 109.179 | 0 | [197, 621] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_123442__694.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10046 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_123535__408 | 1 | 0.0 | 52.6172 | 1 | [197, 373] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_123535__408.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10047 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_014255__455 | 0 | 0.0 | 97.788 | 3 | [197, 539] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_014255__455.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 10048 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_081158__371 | 0 | 0.0 | 16.0121 | 0 | [11, 440] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081158__371.json | 0.0 | missing | missing | missing | |
| 10049 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_124650__770 | 0 | 0.0 | 10.0321 | 0 | [382, 17] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_124650__770.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10050 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_124813__519 | 0 | 0.0 | 82.4902 | 1 | [382, 558] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_124813__519.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10051 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_014714__686 | 1 | 0.0 | 84.0358 | 1 | [382, 567] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_014714__686.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10052 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_081142__961 | 0 | 0.0 | 26.2699 | 0 | [370, 621] | 0.10.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_081142__961.json | 25.0 | missing | missing | missing | |
| 10053 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_124553__428 | 0 | 0.0 | 96.1783 | 0 | [380, 657] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_124553__428.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10054 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_124640__289 | 1 | 0.0 | 47.0937 | 1 | [380, 299] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_124640__289.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 10055 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_014550__618 | 0 | 0.0 | 40.2905 | 0 | [380, 247] | 0.10.0-DEV | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_014550__618.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10056 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231214_083300__260 | 0 | 0.0 | 15.3549 | 0 | [82, 449] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231214_083300__260.json | 0.0 | missing | missing | missing | |
| 10057 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_095935__407 | 0 | 0.0 | 16.5957 | 0 | [104, 297] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_095935__407.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10058 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231225_095950__700 | 0 | 0.0 | 15.5002 | 0 | [104, 277] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_095950__700.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10059 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231214_083245__590 | 0 | 0.0 | 10.1677 | 1 | [99, 294] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_083245__590.json | 58.3333 | missing | missing | missing | |
| 10060 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_095852__559 | 0 | 0.0 | 18.9375 | 2 | [107, 341] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_095852__559.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10061 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_095918__253 | 0 | 0.0 | 26.0104 | 0 | [107, 470] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_095918__253.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10062 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_023107__730 | 0 | 0.0 | 18.3067 | 2 | [107, 327] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_023107__730.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10063 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231227_081626__775 | 0 | 0.0 | 8.66274 | 0 | [107, 147] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_081626__775.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10064 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_083235__267 | 0 | 0.0 | 2.74352 | 0 | [128, 60] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_083235__267.json | 50.0 | missing | missing | missing | |
| 10065 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_095825__934 | 0 | 0.0 | 16.8032 | 1 | [145, 294] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_095825__934.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10066 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_095833__643 | 0 | 0.0 | 7.49558 | 2 | [145, 119] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_095833__643.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10067 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_023048__360 | 0 | 0.0 | 22.5964 | 2 | [145, 400] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_023048__360.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10068 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_081617__675 | 0 | 0.0 | 12.8509 | 2 | [145, 220] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_081617__675.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10069 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_083232__945 | 0 | 0.0 | 19.0877 | 0 | [229, 496] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083232__945.json | 50.0 | missing | missing | missing | |
| 10070 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_095751__741 | 0 | 0.0 | 29.9371 | 3 | [247, 334] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_095751__741.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10071 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_095808__674 | 0 | 0.0 | 15.9587 | 0 | [247, 256] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_095808__674.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10072 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_023026__478 | 0 | 0.0 | 30.1953 | 2 | [247, 349] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_023026__478.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10073 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_081604__194 | 0 | 0.0 | 23.1759 | 0 | [247, 220] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_081604__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10074 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_083351__576 | 0 | 0.0 | 30.824 | 0 | [11, 799] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_083351__576.json | 50.0 | missing | missing | missing | |
| 10075 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_100053__339 | 0 | 0.0 | 28.1159 | 1 | [410, 440] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100053__339.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10076 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_100126__556 | 0 | 0.0 | 32.6241 | 0 | [410, 514] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100126__556.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10077 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_023206__786 | 0 | 0.0 | 21.9796 | 1 | [410, 337] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_023206__786.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10078 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_081713__675 | 0 | 0.0 | 21.973 | 1 | [410, 337] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_081713__675.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10079 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_083320__385 | 0 | 0.0 | 19.228 | 0 | [399, 428] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_083320__385.json | 0.0 | missing | missing | missing | |
| 10080 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_100008__837 | 0 | 0.0 | 17.4432 | 0 | [407, 257] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_100008__837.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10081 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_100024__592 | 0 | 0.0 | 16.7871 | 1 | [407, 240] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_100024__592.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10082 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_023144__504 | 0 | 0.0 | 37.0013 | 2 | [407, 595] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_023144__504.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10083 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_081651__914 | 0 | 0.0 | 25.2054 | 0 | [407, 393] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_081651__914.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10084 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_043125__298 | 0 | 0.0 | 2.39311 | 2 | [0, 182] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043125__298.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10085 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_043129__327 | 0 | 0.0 | 4.46013 | 0 | [0, 335] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043129__327.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10086 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_043132__827 | 0 | 0.0 | 2.55367 | 1 | [0, 192] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043132__827.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10087 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_043135__488 | 0 | 0.0 | 2.81969 | 0 | [0, 208] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043135__488.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10088 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_043138__666 | 0 | 0.0 | 3.45478 | 3 | [0, 262] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043138__666.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10089 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_043053__422 | 0 | 0.0 | 3.31591 | 3 | [0, 251] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043053__422.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10090 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_043055__366 | 0 | 0.0 | 1.72466 | 0 | [0, 131] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043055__366.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10091 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_043057__859 | 0 | 0.0 | 2.57508 | 2 | [0, 195] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043057__859.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10092 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_043059__356 | 0 | 0.0 | 1.83382 | 3 | [0, 139] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043059__356.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10093 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_043101__710 | 0 | 0.0 | 1.45163 | 2 | [0, 110] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043101__710.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10094 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_043025__905 | 0 | 0.0 | 2.80503 | 0 | [0, 211] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_043025__905.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10164 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_042006__547 | 0 | 0.0 | 10.7276 | 1 | [0, 262] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_042006__547.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10165 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_042014__485 | 0 | 0.0 | 8.46453 | 0 | [0, 207] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_042014__485.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10166 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_042021__544 | 0 | 0.0 | 7.00542 | 0 | [0, 171] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_042021__544.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10167 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_042822__878 | 0 | 0.0 | 21.5205 | 3 | [0, 518] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_042822__878.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10168 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_042850__149 | 0 | 0.0 | 28.5168 | 0 | [0, 685] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_042850__149.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10169 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_042905__785 | 0 | 0.0 | 14.5693 | 0 | [0, 351] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_042905__785.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10170 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_042937__880 | 0 | 0.0 | 32.0655 | 0 | [0, 770] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_042937__880.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10171 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_042958__637 | 0 | 0.0 | 20.9212 | 0 | [0, 503] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_042958__637.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10172 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_042547__720 | 0 | 0.0 | 16.3012 | 0 | [0, 393] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_042547__720.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10173 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_042550__268 | 0 | 0.0 | 2.92294 | 0 | [0, 71] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_042550__268.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10174 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_042608__440 | 0 | 0.0 | 18.5881 | 0 | [0, 449] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_042608__440.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10175 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_042629__729 | 0 | 0.0 | 20.4001 | 0 | [0, 492] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_042629__729.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10176 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_042646__234 | 0 | 0.0 | 16.6306 | 0 | [0, 402] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_042646__234.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10177 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_040201__306 | 0 | 0.0 | 17.3249 | 0 | [0, 324] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_040201__306.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10178 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_040207__335 | 0 | 0.0 | 5.78882 | 0 | [0, 108] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_040207__335.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10179 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_040249__801 | 0 | 0.0 | 42.1207 | 3 | [0, 781] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_040249__801.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10180 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_040316__137 | 0 | 0.0 | 26.5688 | 0 | [0, 495] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_040316__137.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10181 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_040343__320 | 0 | 0.0 | 27.0098 | 1 | [0, 503] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_040343__320.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10182 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_035957__919 | 0 | 0.0 | 13.7538 | 0 | [0, 257] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_035957__919.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10183 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_040006__472 | 0 | 0.0 | 8.77758 | 0 | [0, 164] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_040006__472.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10184 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_040028__298 | 0 | 0.0 | 22.3791 | 0 | [0, 417] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_040028__298.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10185 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_040042__924 | 0 | 0.0 | 14.3167 | 1 | [0, 267] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_040042__924.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10186 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_040044__221 | 0 | 0.0 | 1.7705 | 0 | [0, 33] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_040044__221.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10187 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_035708__190 | 0 | 0.0 | 15.6354 | 2 | [0, 291] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_035708__190.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10188 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_035725__659 | 0 | 0.0 | 17.2555 | 0 | [0, 320] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_035725__659.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10189 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_035755__325 | 0 | 0.0 | 29.7685 | 3 | [0, 550] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_035755__325.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10190 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_035809__299 | 0 | 0.0 | 14.3274 | 0 | [0, 266] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_035809__299.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10191 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_035833__459 | 0 | 0.0 | 23.4263 | 0 | [0, 433] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_035833__459.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10192 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_041707__243 | 0 | 0.0 | 37.4966 | 1 | [0, 690] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_041707__243.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10193 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_041716__709 | 0 | 0.0 | 7.99389 | 0 | [0, 148] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_041716__709.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10194 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_041732__773 | 0 | 0.0 | 16.4128 | 0 | [0, 303] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_041732__773.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10195 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_041742__505 | 0 | 0.0 | 10.1129 | 0 | [0, 187] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_041742__505.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10196 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_041754__403 | 0 | 0.0 | 11.7896 | 2 | [0, 218] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_041754__403.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10197 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_041330__603 | 0 | 0.0 | 70.8891 | 0 | [413, 540] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_041330__603.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10198 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_041345__460 | 0 | 0.0 | 14.9256 | 2 | [0, 275] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_041345__460.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10199 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_041358__143 | 0 | 0.0 | 12.3819 | 0 | [0, 228] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_041358__143.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10200 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_041410__625 | 0 | 0.0 | 12.7665 | 0 | [0, 235] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_041410__625.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10201 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_041429__138 | 0 | 0.0 | 18.3036 | 0 | [0, 336] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_041429__138.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10202 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_043411__151 | 0 | 0.0 | 3.23488 | 2 | [0, 387] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043411__151.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10203 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_043414__891 | 0 | 0.0 | 2.85502 | 0 | [0, 343] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043414__891.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10204 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_043418__874 | 0 | 0.0 | 3.33159 | 3 | [0, 400] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043418__874.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10205 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_043420__760 | 0 | 0.0 | 2.26705 | 3 | [0, 273] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043420__760.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10206 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_043422__810 | 0 | 0.0 | 2.27963 | 2 | [0, 274] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_043422__810.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10207 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_043351__175 | 0 | 0.0 | 1.16814 | 0 | [0, 141] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043351__175.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10208 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_043351__581 | 0 | 0.0 | 0.920183 | 0 | [0, 110] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043351__581.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10209 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_043352__317 | 0 | 0.0 | 0.910785 | 0 | [0, 110] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043352__317.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10210 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_043353__331 | 0 | 0.0 | 0.738013 | 0 | [0, 89] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043353__331.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10211 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_043355__217 | 0 | 0.0 | 1.48241 | 0 | [0, 179] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_043355__217.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10212 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_043334__748 | 0 | 0.0 | 1.74167 | 0 | [0, 202] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_043334__748.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10213 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_043335__833 | 0 | 0.0 | 1.57077 | 0 | [0, 182] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_043335__833.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10214 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_043337__251 | 0 | 0.0 | 1.70102 | 0 | [0, 197] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_043337__251.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10215 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_043338__752 | 0 | 0.0 | 1.6898 | 0 | [0, 196] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_043338__752.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10216 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_043341__515 | 0 | 0.0 | 2.16212 | 0 | [0, 250] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_043341__515.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10217 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_043509__943 | 0 | 0.0 | 4.48802 | 0 | [0, 520] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_043509__943.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10218 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_043512__868 | 0 | 0.0 | 2.97372 | 1 | [0, 347] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_043512__868.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10219 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_043514__993 | 0 | 0.0 | 1.90566 | 0 | [0, 224] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_043514__993.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10220 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_043517__322 | 0 | 0.0 | 2.55014 | 1 | [0, 299] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_043517__322.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10221 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_043521__282 | 0 | 0.0 | 3.95031 | 1 | [0, 460] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_043521__282.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10222 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_043439__738 | 0 | 0.0 | 3.3534 | 3 | [0, 392] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_043439__738.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10223 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_043441__239 | 0 | 0.0 | 1.78122 | 3 | [0, 209] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_043441__239.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10224 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_043443__917 | 0 | 0.0 | 2.56845 | 3 | [0, 300] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_043443__917.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10225 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_043446__340 | 0 | 0.0 | 2.15407 | 0 | [0, 253] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_043446__340.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10226 | NVIDIA-RTX-4090-4x | q_and_a_extractor | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_043449__780 | 0 | 0.0 | 2.4777 | 3 | [0, 291] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_043449__780.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10227 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_111945__483 | 0 | 0.0 | 61.5505 | 0 | [104, 360] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_111945__483.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10228 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_112039__484 | 0 | 0.0 | 54.5968 | 0 | [104, 311] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_112039__484.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10229 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_111843__906 | 0 | 0.0 | 48.7137 | 0 | [107, 283] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_111843__906.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10230 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_082923__513 | 0 | 0.0 | 63.564 | 2 | [107, 380] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_082923__513.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10231 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_102435__200 | 0 | 0.0 | 50.6783 | 0 | [148, 295] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_102435__200.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10232 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_102514__119 | 0 | 0.0 | 39.7093 | 0 | [148, 227] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_102514__119.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10233 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_082819__573 | 0 | 0.0 | 44.6934 | 0 | [148, 257] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_082819__573.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10234 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_102257__567 | 0 | 0.0 | 77.1385 | 0 | [250, 277] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_102257__567.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10235 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_102344__800 | 0 | 0.0 | 47.1144 | 0 | [250, 256] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_102344__800.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10236 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_082734__750 | 0 | 0.0 | 91.8928 | 0 | [250, 377] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_082734__750.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10237 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_112409__143 | 0 | 0.0 | 72.6561 | 0 | [436, 375] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_112409__143.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10238 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_112536__495 | 0 | 0.0 | 86.5173 | 0 | [436, 458] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_112536__495.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10239 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_083159__492 | 0 | 0.0 | 76.5146 | 3 | [436, 398] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_083159__492.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10240 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_112130__292 | 0 | 0.0 | 50.5665 | 3 | [434, 235] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_112130__292.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10241 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_112256__889 | 0 | 0.0 | 85.5161 | 0 | [434, 442] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_112256__889.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10242 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_083042__182 | 0 | 0.0 | 78.9185 | 0 | [434, 412] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_083042__182.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10243 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_084044__258 | 0 | 0.0 | 10.1725 | 0 | [102, 386] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_084044__258.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10244 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_125708__394 | 0 | 0.0 | 9.26925 | 0 | [102, 345] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_125708__394.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10245 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_125728__278 | 0 | 0.0 | 20.3521 | 0 | [102, 745] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_125728__278.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10246 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_125739__654 | 0 | 0.0 | 10.6576 | 0 | [102, 404] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_125739__654.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10247 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084033__410 | 0 | 0.0 | 5.95006 | 0 | [139, 219] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_084033__410.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10248 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_125635__419 | 0 | 0.0 | 3.49661 | 0 | [139, 119] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_125635__419.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10249 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_125641__189 | 0 | 0.0 | 6.55809 | 0 | [139, 232] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_125641__189.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10250 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_125659__581 | 0 | 0.0 | 17.2792 | 0 | [139, 616] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_125659__581.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10251 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_084028__555 | 0 | 0.0 | 11.4824 | 0 | [238, 282] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084028__555.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10252 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_125623__899 | 0 | 0.0 | 8.2374 | 0 | [238, 150] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_125623__899.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10253 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_125627__172 | 0 | 0.0 | 4.30891 | 0 | [238, 136] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_125627__172.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10254 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_125631__187 | 0 | 0.0 | 4.14138 | 0 | [238, 129] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_125631__187.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10255 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_084057__719 | 0 | 0.0 | 6.96847 | 0 | [391, 212] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084057__719.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10256 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_125815__592 | 0 | 0.0 | 9.32438 | 0 | [391, 297] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125815__592.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10257 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_125827__862 | 0 | 0.0 | 11.381 | 0 | [391, 366] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125827__862.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10258 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_125836__672 | 0 | 0.0 | 9.78339 | 0 | [391, 313] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125836__672.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10259 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_084050__449 | 0 | 0.0 | 5.99048 | 0 | [388, 176] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_084050__449.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10260 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_125748__860 | 0 | 0.0 | 8.84251 | 0 | [388, 280] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_125748__860.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10261 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_125757__645 | 0 | 0.0 | 9.46161 | 0 | [388, 302] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_125757__645.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10262 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_125806__582 | 0 | 0.0 | 8.46973 | 0 | [388, 266] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_125806__582.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10263 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_112030__856 | 0 | 0.0 | 6.92361 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112030__856.json | 0.0 | missing | missing | missing | |
| 10264 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_112036__775 | 0 | 0.0 | 6.25794 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112036__775.json | 50.0 | missing | missing | missing | |
| 10265 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_112047__555 | 0 | 0.0 | 10.8139 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112047__555.json | 50.0 | missing | missing | missing | |
| 10266 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 5 | 20240217_112051__286 | 0 | 0.0 | 4.54701 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112051__286.json | 0.0 | missing | missing | missing | |
| 10267 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_112055__608 | 0 | 0.0 | 3.12045 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112055__608.json | 50.0 | missing | missing | missing | |
| 10268 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240217_111931__352 | 0 | 0.0 | 2.17009 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111931__352.json | 25.0 | missing | missing | missing | |
| 10269 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240217_111939__464 | 0 | 0.0 | 7.20348 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111939__464.json | 25.0 | missing | missing | missing | |
| 10270 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_111942__921 | 0 | 0.0 | 3.49111 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111942__921.json | 50.0 | missing | missing | missing | |
| 10271 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240217_111953__326 | 0 | 0.0 | 10.7046 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111953__326.json | 25.0 | missing | missing | missing | |
| 10272 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240217_111956__159 | 0 | 0.0 | 2.77671 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_111956__159.json | 0.0 | missing | missing | missing | |
| 10273 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240217_111826__712 | 0 | 0.0 | 3.10333 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111826__712.json | 0.0 | missing | missing | missing | |
| 10274 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_111829__117 | 0 | 0.0 | 2.4933 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111829__117.json | 50.0 | missing | missing | missing | |
| 10275 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_111839__494 | 0 | 0.0 | 10.4041 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111839__494.json | 50.0 | missing | missing | missing | |
| 10276 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_111851__750 | 0 | 0.0 | 11.8836 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111851__750.json | 50.0 | missing | missing | missing | |
| 10277 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_111853__181 | 0 | 0.0 | 1.96442 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_111853__181.json | 50.0 | missing | missing | missing | |
| 10278 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_112200__937 | 0 | 0.0 | 2.42211 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112200__937.json | 0.0 | missing | missing | missing | |
| 10279 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_112206__463 | 0 | 0.0 | 5.15661 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112206__463.json | 0.0 | missing | missing | missing | |
| 10280 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_112209__168 | 0 | 0.0 | 3.57419 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112209__168.json | 0.0 | missing | missing | missing | |
| 10281 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_112213__693 | 0 | 0.0 | 3.45751 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112213__693.json | 50.0 | missing | missing | missing | |
| 10282 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_112216__413 | 0 | 0.0 | 3.18271 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112216__413.json | 0.0 | missing | missing | missing | |
| 10283 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_112113__642 | 0 | 0.0 | 4.06894 | 3 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112113__642.json | 75.0 | missing | missing | missing | |
| 10284 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_112121__745 | 0 | 0.0 | 7.95191 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112121__745.json | 0.0 | missing | missing | missing | |
| 10285 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_112125__862 | 0 | 0.0 | 3.24138 | 1 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112125__862.json | 58.3333 | missing | missing | missing | |
| 10286 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_112128__311 | 0 | 0.0 | 2.79965 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112128__311.json | 50.0 | missing | missing | missing | |
| 10287 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20240217_112135__759 | 0 | 0.0 | 7.25804 | 0 | [0, 0] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112135__759.json | 0.0 | missing | missing | missing | |
| 10288 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240224_001959__522 | 0 | 0.0 | 24.0824 | 0 | [0, 369] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_001959__522.json | 0.0 | missing | missing | missing | |
| 10289 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240224_002025__829 | 0 | 0.0 | 26.3018 | 0 | [0, 402] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_002025__829.json | 0.0 | missing | missing | missing | |
| 10290 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240224_002048__920 | 0 | 0.0 | 22.4419 | 0 | [0, 344] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_002048__920.json | 0.0 | missing | missing | missing | |
| 10291 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240224_002110__778 | 0 | 0.0 | 22.1042 | 0 | [0, 335] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_002110__778.json | 0.0 | missing | missing | missing | |
| 10292 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 5 | 20240224_002129__120 | 0 | 0.0 | 19.0048 | 0 | [0, 287] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_002129__120.json | 0.0 | missing | missing | missing | |
| 10293 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240224_001636__489 | 0 | 0.0 | 20.443 | 1 | [0, 316] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_001636__489.json | 58.3333 | missing | missing | missing | |
| 10294 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240224_001658__677 | 0 | 0.0 | 21.7348 | 0 | [0, 329] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_001658__677.json | 0.0 | missing | missing | missing | |
| 10295 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240224_001716__930 | 0 | 0.0 | 18.3784 | 0 | [0, 282] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_001716__930.json | 50.0 | missing | missing | missing | |
| 10296 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240224_001723__602 | 0 | 0.0 | 7.01863 | 0 | [0, 111] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_001723__602.json | 25.0 | missing | missing | missing | |
| 10297 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240224_001729__170 | 0 | 0.0 | 6.13338 | 0 | [0, 92] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_001729__170.json | 25.0 | missing | missing | missing | |
| 10298 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240224_001306__642 | 0 | 0.0 | 27.0355 | 0 | [0, 414] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_001306__642.json | 25.0 | missing | missing | missing | |
| 10299 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240224_001336__496 | 0 | 0.0 | 30.2296 | 0 | [0, 463] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_001336__496.json | 0.0 | missing | missing | missing | |
| 10300 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240224_001401__616 | 0 | 0.0 | 24.7126 | 0 | [0, 379] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_001401__616.json | 25.0 | missing | missing | missing | |
| 10301 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240224_001429__423 | 0 | 0.0 | 28.7164 | 0 | [0, 439] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_001429__423.json | 25.0 | missing | missing | missing | |
| 10302 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240224_001454__229 | 0 | 0.0 | 24.832 | 0 | [0, 382] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_001454__229.json | 25.0 | missing | missing | missing | |
| 10303 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240224_002829__464 | 0 | 0.0 | 27.4625 | 1 | [0, 418] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_002829__464.json | 58.3333 | missing | missing | missing | |
| 10304 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240224_002857__756 | 0 | 0.0 | 27.3591 | 0 | [0, 413] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_002857__756.json | 0.0 | missing | missing | missing | |
| 10305 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240224_002923__298 | 0 | 0.0 | 26.3654 | 0 | [0, 400] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_002923__298.json | 0.0 | missing | missing | missing | |
| 10306 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240224_002952__898 | 0 | 0.0 | 28.647 | 0 | [0, 438] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_002952__898.json | 50.0 | missing | missing | missing | |
| 10307 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240224_003015__355 | 0 | 0.0 | 23.0078 | 0 | [0, 344] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_003015__355.json | 0.0 | missing | missing | missing | |
| 10308 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240224_002401__543 | 0 | 0.0 | 28.1148 | 0 | [0, 425] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_002401__543.json | 0.0 | missing | missing | missing | |
| 10309 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240224_002426__494 | 0 | 0.0 | 24.542 | 0 | [0, 373] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_002426__494.json | 50.0 | missing | missing | missing | |
| 10310 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240224_002454__263 | 0 | 0.0 | 27.5548 | 0 | [0, 414] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_002454__263.json | 0.0 | missing | missing | missing | |
| 10311 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240224_002519__144 | 0 | 0.0 | 25.7187 | 0 | [0, 390] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_002519__144.json | 0.0 | missing | missing | missing | |
| 10312 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240224_002545__759 | 0 | 0.0 | 25.604 | 0 | [0, 383] | 0.13.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_002545__759.json | 50.0 | missing | missing | missing | |
| 10313 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231213_204752__855 | 0 | 0.0003365 | 4.27869 | 0 | [88, 195] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_204752__855.json | 0.0 | missing | missing | missing | |
| 10314 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_225709__750 | 0 | 0.0007355 | 6.4248 | 0 | [88, 461] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_225709__750.json | 0.0 | missing | missing | missing | |
| 10315 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | AsIs | 1SHOT | false | false | 5 | 20231225_225712__199 | 0 | 0.0004025 | 3.5547 | 0 | [88, 239] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_225712__199.json | 0.0 | missing | missing | missing | |
| 10316 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo--optim | AsIs | 1SHOT | false | false | 5 | 20231215_201032__351 | 0 | 0.0 | 5.95939 | 0 | [88, 245] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_201032__351.json | 0.0 | 0.5 | missing | 0.5 | |
| 10317 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_204748__278 | 0 | 0.000668 | 9.46467 | 2 | [91, 415] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_204748__278.json | 66.6667 | missing | missing | missing | |
| 10318 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_225658__646 | 0 | 0.0005405 | 4.51322 | 3 | [91, 330] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_225658__646.json | 75.0 | missing | missing | missing | |
| 10319 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_225702__630 | 0 | 0.000467 | 4.20636 | 0 | [91, 281] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_225702__630.json | 50.0 | missing | missing | missing | |
| 10320 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_204418__718 | 0 | 0.000419 | 4.13904 | 1 | [91, 249] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_204418__718.json | 58.3333 | missing | missing | missing | |
| 10321 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_204424__780 | 0 | 0.00056 | 5.97497 | 0 | [91, 343] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_204424__780.json | 50.0 | missing | missing | missing | |
| 10322 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_201026__116 | 0 | 0.0 | 5.96681 | 3 | [91, 288] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_201026__116.json | 75.0 | 0.5 | missing | 0.5 | |
| 10323 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_204738__566 | 0 | 0.000384 | 6.75723 | 1 | [126, 214] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_204738__566.json | 58.3333 | missing | missing | missing | |
| 10324 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_225652__661 | 0 | 0.0004875 | 3.97299 | 2 | [126, 283] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_225652__661.json | 66.6667 | missing | missing | missing | |
| 10325 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_225653__802 | 0 | 0.0001485 | 1.18304 | 0 | [126, 57] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_225653__802.json | 0.0 | missing | missing | missing | |
| 10326 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_204412__431 | 0 | 0.0004215 | 4.48628 | 0 | [126, 239] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_204412__431.json | 0.0 | missing | missing | missing | |
| 10327 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_204414__847 | 0 | 0.0002205 | 2.05193 | 3 | [126, 105] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_204414__847.json | 75.0 | missing | missing | missing | |
| 10328 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_201020__123 | 0 | 0.0 | 2.12891 | 1 | [126, 74] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_201020__123.json | 58.3333 | 0.5 | missing | 0.5 | |
| 10329 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_204731__427 | 0 | 0.0005885 | 8.25251 | 2 | [208, 323] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_204731__427.json | 66.6667 | missing | missing | missing | |
| 10330 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_225645__965 | 0 | 0.000368 | 2.78619 | 0 | [208, 176] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225645__965.json | 50.0 | missing | missing | missing | |
| 10331 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_225647__545 | 0 | 0.0003005 | 1.89372 | 0 | [208, 131] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225647__545.json | 50.0 | missing | missing | missing | |
| 10332 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_204406__319 | 0 | 0.000233 | 1.89706 | 0 | [208, 86] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_204406__319.json | 0.0 | missing | missing | missing | |
| 10333 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_204407__505 | 0 | 0.0002375 | 1.58868 | 0 | [208, 89] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_204407__505.json | 0.0 | missing | missing | missing | |
| 10334 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231215_201017__700 | 0 | 0.0 | 2.76878 | 0 | [208, 112] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_201017__700.json | 0.0 | 0.5 | missing | 0.5 | |
| 10335 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_204802__417 | 0 | 0.0006245 | 7.4018 | 0 | [349, 300] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_204802__417.json | 50.0 | missing | missing | missing | |
| 10336 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_225719__770 | 0 | 0.0002465 | 0.913331 | 0 | [349, 48] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225719__770.json | 0.0 | missing | missing | missing | |
| 10337 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_225721__827 | 0 | 0.000365 | 2.25595 | 0 | [349, 127] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225721__827.json | 0.0 | missing | missing | missing | |
| 10338 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_204433__481 | 0 | 0.000425 | 3.15476 | 0 | [349, 167] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_204433__481.json | 0.0 | missing | missing | missing | |
| 10339 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_204438__657 | 0 | 0.000647 | 4.86888 | 3 | [349, 315] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_204438__657.json | 75.0 | missing | missing | missing | |
| 10340 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_201035__385 | 0 | 0.0 | 1.55645 | 0 | [349, 64] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_201035__385.json | 0.0 | 0.5 | missing | 0.5 | |
| 10341 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_204755__529 | 0 | 0.0003135 | 2.57655 | 0 | [348, 93] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_204755__529.json | 0.0 | missing | missing | missing | |
| 10342 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_225714__179 | 0 | 0.0003765 | 2.01658 | 0 | [348, 135] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_225714__179.json | 0.0 | missing | missing | missing | |
| 10343 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_225718__292 | 0 | 0.000618 | 3.88624 | 0 | [348, 296] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_225718__292.json | 0.0 | missing | missing | missing | |
| 10344 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_204426__587 | 0 | 0.000321 | 1.7196 | 0 | [348, 98] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_204426__587.json | 0.0 | missing | missing | missing | |
| 10345 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_204430__169 | 0 | 0.000459 | 3.57461 | 0 | [348, 190] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_204430__169.json | 0.0 | missing | missing | missing | |
| 10346 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_201034__169 | 0 | 0.0 | 1.87253 | 0 | [348, 68] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_201034__169.json | 0.0 | 0.5 | missing | 0.5 | |
| 10347 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200645__124 | 0 | 0.000317 | 1.56933 | 0 | [91, 181] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200645__124.json | 50.0 | missing | missing | missing | |
| 10348 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200647__398 | 0 | 0.000392 | 1.88869 | 1 | [91, 231] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200647__398.json | 58.3333 | missing | missing | missing | |
| 10349 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200650__579 | 0 | 0.0004895 | 2.29447 | 3 | [91, 296] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200650__579.json | 75.0 | missing | missing | missing | |
| 10350 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200651__308 | 0 | 0.0002945 | 1.31873 | 0 | [91, 166] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200651__308.json | 50.0 | missing | missing | missing | |
| 10351 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200652__284 | 0 | 0.000227 | 1.187 | 0 | [91, 121] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200652__284.json | 50.0 | missing | missing | missing | |
| 10352 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200640__520 | 0 | 0.000243 | 1.18965 | 0 | [126, 120] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200640__520.json | 50.0 | missing | missing | missing | |
| 10353 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200641__430 | 0 | 0.0001725 | 0.789047 | 0 | [126, 73] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200641__430.json | 50.0 | missing | missing | missing | |
| 10354 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200642__964 | 0 | 0.00015 | 0.817819 | 1 | [126, 58] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200642__964.json | 58.3333 | missing | missing | missing | |
| 10355 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200643__647 | 0 | 0.0001725 | 0.921815 | 0 | [126, 73] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200643__647.json | 50.0 | missing | missing | missing | |
| 10356 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200644__510 | 0 | 0.0001695 | 0.735478 | 0 | [126, 71] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200644__510.json | 50.0 | missing | missing | missing | |
| 10357 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200635__516 | 0 | 0.0002945 | 1.24149 | 0 | [208, 127] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200635__516.json | 25.0 | missing | missing | missing | |
| 10358 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200636__101 | 0 | 0.000266 | 1.12106 | 0 | [208, 108] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200636__101.json | 25.0 | missing | missing | missing | |
| 10359 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240201_200637__179 | 0 | 0.0003275 | 1.15996 | 0 | [208, 149] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200637__179.json | 25.0 | missing | missing | missing | |
| 10360 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200638__842 | 0 | 0.0002435 | 1.03856 | 1 | [208, 93] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200638__842.json | 58.3333 | missing | missing | missing | |
| 10361 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_200639__376 | 0 | 0.0002765 | 1.04353 | 0 | [208, 115] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200639__376.json | 0.0 | missing | missing | missing | |
| 10362 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200700__145 | 0 | 0.000302 | 1.04252 | 0 | [349, 85] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200700__145.json | 0.0 | missing | missing | missing | |
| 10363 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200701__625 | 0 | 0.0003035 | 0.923207 | 0 | [349, 86] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200701__625.json | 0.0 | missing | missing | missing | |
| 10364 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200701__773 | 0 | 0.00023 | 0.708539 | 0 | [349, 37] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200701__773.json | 0.0 | missing | missing | missing | |
| 10365 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200702__828 | 0 | 0.0002555 | 0.687881 | 0 | [349, 54] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200702__828.json | 0.0 | missing | missing | missing | |
| 10366 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200703__772 | 0 | 0.0002885 | 0.792552 | 0 | [349, 76] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200703__772.json | 0.0 | missing | missing | missing | |
| 10367 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200653__363 | 0 | 0.0003045 | 0.865859 | 0 | [348, 87] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200653__363.json | 50.0 | missing | missing | missing | |
| 10368 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200655__101 | 0 | 0.0004725 | 1.68256 | 2 | [348, 199] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200655__101.json | 66.6667 | missing | missing | missing | |
| 10369 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200656__192 | 0 | 0.000378 | 1.12048 | 1 | [348, 136] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200656__192.json | 58.3333 | missing | missing | missing | |
| 10370 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_200657__638 | 0 | 0.00033 | 0.951782 | 0 | [348, 104] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200657__638.json | 0.0 | missing | missing | missing | |
| 10371 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200659__487 | 0 | 0.0004395 | 1.59668 | 1 | [348, 177] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200659__487.json | 58.3333 | missing | missing | missing | |
| 10372 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231213_204817__530 | 0 | 0.000596 | 4.70524 | 0 | [88, 254] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_204817__530.json | 0.0 | missing | missing | missing | |
| 10373 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_225734__712 | 0 | 0.000298 | 1.6554 | 0 | [88, 105] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_225734__712.json | 0.0 | missing | missing | missing | |
| 10374 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 5 | 20231225_225737__765 | 0 | 0.00051 | 2.39609 | 0 | [88, 211] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_225737__765.json | 0.0 | missing | missing | missing | |
| 10375 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 5 | 20231215_201048__721 | 0 | 0.0 | 4.44482 | 0 | [88, 182] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_201048__721.json | 0.0 | 0.9 | missing | 0.1 |
| 10376 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231213_204812__680 | 0 | 0.000603 | 4.79366 | 1 | [91, 256] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_204812__680.json | 58.3333 | missing | missing | missing | |
| 10377 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_225730__697 | 0 | 0.000575 | 2.6699 | 0 | [91, 242] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_225730__697.json | 50.0 | missing | missing | missing | |
| 10378 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_225732__794 | 0 | 0.000437 | 2.40582 | 3 | [91, 173] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_225732__794.json | 75.0 | missing | missing | missing | |
| 10379 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | InJulia | 1SHOT | false | false | 5 | 20231227_204447__726 | 0 | 0.000415 | 2.57266 | 0 | [91, 162] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_204447__726.json | 0.0 | missing | missing | missing | |
| 10380 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_204450__287 | 0 | 0.000451 | 2.35523 | 3 | [91, 180] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_204450__287.json | 75.0 | missing | missing | missing | |
| 10381 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_201043__863 | 0 | 0.0 | 2.98443 | 1 | [91, 163] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_201043__863.json | 58.3333 | 0.9 | missing | 0.1 |
| 10382 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_204807__602 | 0 | 0.000256 | 1.81419 | 1 | [126, 65] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_204807__602.json | 58.3333 | missing | missing | missing | |
| 10383 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_225726__517 | 0 | 0.000236 | 1.04937 | 1 | [126, 55] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_225726__517.json | 58.3333 | missing | missing | missing | |
| 10384 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_225727__982 | 0 | 0.000272 | 1.12921 | 0 | [126, 73] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_225727__982.json | 50.0 | missing | missing | missing | |
| 10385 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_204443__424 | 0 | 0.000308 | 1.81417 | 0 | [126, 91] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_204443__424.json | 50.0 | missing | missing | missing | |
| 10386 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_204445__982 | 0 | 0.000292 | 1.40265 | 0 | [126, 83] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_204445__982.json | 50.0 | missing | missing | missing | |
| 10387 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_201040__918 | 0 | 0.0 | 2.4932 | 1 | [126, 66] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_201040__918.json | 58.3333 | 0.9 | missing | 0.1 |
| 10388 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_204805__436 | 0 | 0.00036 | 2.27963 | 0 | [208, 76] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_204805__436.json | 50.0 | missing | missing | missing | |
| 10389 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_225723__817 | 0 | 0.00043 | 1.8317 | 0 | [208, 111] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225723__817.json | 25.0 | missing | missing | missing | |
| 10390 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_225725__136 | 0 | 0.00041 | 1.3795 | 0 | [208, 101] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225725__136.json | 25.0 | missing | missing | missing | |
| 10391 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_204440__991 | 0 | 0.000408 | 1.71869 | 0 | [208, 100] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_204440__991.json | 25.0 | missing | missing | missing | |
| 10392 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_204441__672 | 0 | 0.000398 | 1.61819 | 1 | [208, 95] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_204441__672.json | 58.3333 | missing | missing | missing | |
| 10393 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_201037__402 | 0 | 0.0 | 2.02709 | 0 | [208, 106] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_201037__402.json | 25.0 | 0.9 | missing | 0.1 |
| 10394 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_204823__937 | 0 | 0.000539 | 1.99123 | 3 | [349, 95] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_204823__937.json | 75.0 | missing | missing | missing | |
| 10395 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_225740__675 | 0 | 0.000435 | 0.956485 | 0 | [349, 43] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225740__675.json | 0.0 | missing | missing | missing | |
| 10396 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_225741__798 | 0 | 0.000531 | 1.20566 | 0 | [349, 91] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_225741__798.json | 0.0 | missing | missing | missing | |
| 10397 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_204454__183 | 0 | 0.000583 | 2.2596 | 0 | [349, 117] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_204454__183.json | 0.0 | missing | missing | missing | |
| 10398 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_204456__614 | 0 | 0.000537 | 1.88494 | 0 | [349, 94] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_204456__614.json | 0.0 | missing | missing | missing | |
| 10399 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_201051__814 | 0 | 0.0 | 1.28406 | 0 | [349, 74] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_201051__814.json | 0.0 | 0.9 | missing | 0.1 |
| 10400 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_204821__187 | 0 | 0.000544 | 3.813 | 2 | [348, 98] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_204821__187.json | 66.6667 | missing | missing | missing | |
| 10401 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_225738__222 | 0 | 0.00046 | 0.982561 | 0 | [348, 56] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_225738__222.json | 0.0 | missing | missing | missing | |
| 10402 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_225739__953 | 0 | 0.000552 | 1.47454 | 2 | [348, 102] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_225739__953.json | 66.6667 | missing | missing | missing | |
| 10403 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_204451__822 | 0 | 0.00043 | 1.01494 | 0 | [348, 41] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_204451__822.json | 0.0 | missing | missing | missing | |
| 10404 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_204452__977 | 0 | 0.000464 | 1.02377 | 0 | [348, 58] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_204452__977.json | 0.0 | missing | missing | missing | |
| 10405 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_201050__821 | 0 | 0.0 | 2.43352 | 0 | [348, 99] | 0.10.0-DEV | 3 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_201050__821.json | 0.0 | 0.9 | missing | 0.1 |
| 10406 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | InJulia | 1SHOT | false | false | 5 | 20240201_111819__880 | 0 | 0.01696 | 24.8069 | 0 | [91, 535] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_111819__880.json | 0.0 | missing | missing | missing | |
| 10407 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_111858__291 | 0 | 0.01648 | 38.3155 | 2 | [91, 519] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_111858__291.json | 66.6667 | missing | missing | missing | |
| 10408 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | InJulia | 1SHOT | false | false | 5 | 20240201_111932__906 | 0 | 0.01762 | 34.7624 | 0 | [91, 557] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_111932__906.json | 0.0 | missing | missing | missing | |
| 10409 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | InJulia | 1SHOT | false | false | 5 | 20240201_112000__206 | 0 | 0.01366 | 27.5364 | 0 | [91, 425] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_112000__206.json | 0.0 | missing | missing | missing | |
| 10410 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_112116__578 | 0 | 0.0184 | 76.1433 | 2 | [91, 583] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_112116__578.json | 66.6667 | missing | missing | missing | |
| 10411 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_111416__863 | 0 | 0.00543 | 14.2443 | 0 | [126, 139] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_111416__863.json | 50.0 | missing | missing | missing | |
| 10412 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_111426__256 | 0 | 0.00867 | 10.0674 | 2 | [126, 247] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_111426__256.json | 66.6667 | missing | missing | missing | |
| 10413 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_111438__905 | 0 | 0.00459 | 12.3544 | 2 | [126, 111] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_111438__905.json | 66.6667 | missing | missing | missing | |
| 10414 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_111448__510 | 0 | 0.00417 | 10.1797 | 0 | [126, 97] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_111448__510.json | 50.0 | missing | missing | missing | |
| 10415 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_111459__700 | 0 | 0.00564 | 11.0391 | 2 | [126, 146] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_111459__700.json | 66.6667 | missing | missing | missing | |
| 10416 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_111050__179 | 0 | 0.00667 | 9.60227 | 0 | [208, 153] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_111050__179.json | 0.0 | missing | missing | missing | |
| 10417 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_111134__402 | 0 | 0.01519 | 44.1364 | 2 | [208, 437] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_111134__402.json | 66.6667 | missing | missing | missing | |
| 10418 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_111153__875 | 0 | 0.0112 | 18.5734 | 0 | [208, 304] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_111153__875.json | 0.0 | missing | missing | missing | |
| 10419 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_111238__698 | 0 | 0.01672 | 45.1352 | 0 | [208, 488] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_111238__698.json | 50.0 | missing | missing | missing | |
| 10420 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_111310__772 | 0 | 0.01105 | 31.788 | 0 | [208, 299] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_111310__772.json | 0.0 | missing | missing | missing | |
| 10421 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_113149__599 | 0 | 0.02032 | 34.3014 | 3 | [349, 561] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_113149__599.json | 75.0 | missing | missing | missing | |
| 10422 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_113306__523 | 0 | 0.02077 | 76.3142 | 0 | [349, 576] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_113306__523.json | 50.0 | missing | missing | missing | |
| 10423 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_113348__584 | 0 | 0.01864 | 42.3783 | 3 | [349, 505] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_113348__584.json | 75.0 | missing | missing | missing | |
| 10424 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_113420__365 | 0 | 0.01933 | 32.3287 | 3 | [349, 528] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_113420__365.json | 75.0 | missing | missing | missing | |
| 10425 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_113501__995 | 0 | 0.02077 | 40.2867 | 0 | [349, 576] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_113501__995.json | 0.0 | missing | missing | missing | |
| 10426 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_112511__137 | 0 | 0.01821 | 44.8601 | 3 | [348, 491] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_112511__137.json | 75.0 | missing | missing | missing | |
| 10427 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_112600__700 | 0 | 0.02526 | 48.8766 | 3 | [348, 726] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_112600__700.json | 75.0 | missing | missing | missing | |
| 10428 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_112635__269 | 0 | 0.01596 | 34.6511 | 0 | [348, 416] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_112635__269.json | 0.0 | missing | missing | missing | |
| 10429 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_112701__469 | 0 | 0.01776 | 26.6497 | 3 | [348, 476] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_112701__469.json | 75.0 | missing | missing | missing | |
| 10430 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_112753__250 | 0 | 0.02133 | 51.3912 | 2 | [348, 595] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_112753__250.json | 66.6667 | missing | missing | missing | |
| 10431 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231213_205033__364 | 0 | 0.01417 | 46.3039 | 0 | [88, 443] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_205033__364.json | 0.0 | missing | missing | missing | |
| 10432 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_225918__951 | 0 | 0.01657 | 17.7497 | 0 | [88, 523] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_225918__951.json | 0.0 | missing | missing | missing | |
| 10433 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 5 | 20231225_225937__988 | 0 | 0.01537 | 19.2243 | 0 | [88, 483] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_225937__988.json | 0.0 | missing | missing | missing | |
| 10434 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 5 | 20231215_201231__134 | 0 | 0.0 | 28.2058 | 0 | [88, 396] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_201231__134.json | 0.0 | 0.1 | missing | 0.9 | |
| 10435 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231213_204946__873 | 0 | 0.01921 | 45.8849 | 2 | [91, 610] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_204946__873.json | 66.6667 | missing | missing | missing | |
| 10436 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_225841__956 | 0 | 0.01432 | 15.8511 | 2 | [91, 447] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_225841__956.json | 66.6667 | missing | missing | missing | |
| 10437 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_225900__811 | 0 | 0.01492 | 18.2085 | 2 | [91, 467] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_225900__811.json | 66.6667 | missing | missing | missing | |
| 10438 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_204649__833 | 0 | 0.01438 | 38.6617 | 0 | [91, 449] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_204649__833.json | 50.0 | missing | missing | missing | |
| 10439 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | InJulia | 1SHOT | false | false | 5 | 20231227_204732__511 | 0 | 0.01591 | 42.1858 | 0 | [91, 500] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_204732__511.json | 0.0 | missing | missing | missing | |
| 10440 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_201203__482 | 0 | 0.0 | 30.6133 | 2 | [91, 449] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_201203__482.json | 66.6667 | 0.1 | missing | 0.9 | |
| 10441 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_204900__805 | 0 | 0.00501 | 14.0962 | 0 | [126, 125] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_204900__805.json | 50.0 | missing | missing | missing | |
| 10442 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_225817__919 | 0 | 0.00831 | 8.61903 | 2 | [126, 235] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_225817__919.json | 66.6667 | missing | missing | missing | |
| 10443 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_225825__284 | 0 | 0.00723 | 8.09745 | 2 | [126, 199] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_225825__284.json | 66.6667 | missing | missing | missing | |
| 10444 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_204600__128 | 0 | 0.00762 | 11.7765 | 0 | [126, 212] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_204600__128.json | 25.0 | missing | missing | missing | |
| 10445 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_204611__162 | 0 | 0.00642 | 11.0597 | 0 | [126, 172] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_204611__162.json | 25.0 | missing | missing | missing | |
| 10446 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_201132__923 | 0 | 0.0 | 15.4109 | 2 | [126, 207] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_201132__923.json | 66.6667 | 0.1 | missing | 0.9 | |
| 10447 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_204846__806 | 0 | 0.01264 | 23.515 | 2 | [208, 352] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_204846__806.json | 66.6667 | missing | missing | missing | |
| 10448 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_225758__269 | 0 | 0.01156 | 16.0659 | 2 | [208, 316] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225758__269.json | 66.6667 | missing | missing | missing | |
| 10449 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_225808__511 | 0 | 0.0115 | 10.7009 | 0 | [208, 314] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_225808__511.json | 0.0 | missing | missing | missing | |
| 10450 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_204523__287 | 0 | 0.00748 | 27.0266 | 0 | [208, 180] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_204523__287.json | 25.0 | missing | missing | missing | |
| 10451 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_204548__156 | 0 | 0.01003 | 24.4077 | 0 | [208, 265] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_204548__156.json | 25.0 | missing | missing | missing | |
| 10452 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_201116__664 | 0 | 0.0 | 25.0201 | 2 | [208, 302] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_201116__664.json | 66.6667 | 0.1 | missing | 0.9 | |
| 10453 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_205146__771 | 0 | 0.01945 | 41.6548 | 3 | [349, 532] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_205146__771.json | 75.0 | missing | missing | missing | |
| 10454 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_230028__857 | 0 | 0.01975 | 19.3426 | 3 | [349, 542] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230028__857.json | 75.0 | missing | missing | missing | |
| 10455 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_230055__853 | 0 | 0.01699 | 27.3104 | 3 | [349, 450] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230055__853.json | 75.0 | missing | missing | missing | |
| 10456 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_204910__246 | 0 | 0.0103 | 16.5825 | 0 | [349, 227] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_204910__246.json | 50.0 | missing | missing | missing | |
| 10457 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_204958__689 | 0 | 0.02245 | 48.3833 | 3 | [349, 632] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_204958__689.json | 75.0 | missing | missing | missing | |
| 10458 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_201415__464 | 0 | 0.0 | 57.9328 | 3 | [349, 499] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_201415__464.json | 75.0 | 0.1 | missing | 0.9 | |
| 10459 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_205104__818 | 0 | 0.0153 | 30.8231 | 3 | [348, 394] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_205104__818.json | 75.0 | missing | missing | missing | |
| 10460 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_225955__787 | 0 | 0.01815 | 17.6787 | 1 | [348, 489] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_225955__787.json | 58.3333 | missing | missing | missing | |
| 10461 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_230008__723 | 0 | 0.0144 | 13.2595 | 3 | [348, 364] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_230008__723.json | 75.0 | missing | missing | missing | |
| 10462 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_204812__767 | 0 | 0.02382 | 40.6552 | 1 | [348, 678] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_204812__767.json | 58.3333 | missing | missing | missing | |
| 10463 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_204853__331 | 0 | 0.01866 | 40.3072 | 0 | [348, 506] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_204853__331.json | 50.0 | missing | missing | missing | |
| 10464 | Apple-MacBook-Pro-M1 | q_and_a_extractor | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_201317__725 | 0 | 0.0 | 45.4066 | 3 | [348, 508] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_201317__725.json | 75.0 | 0.1 | missing | 0.9 | |
| 10465 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | AsIs | 1SHOT | false | false | 5 | 20231214_082517__823 | 0 | 0.0 | 17.0373 | 0 | [82, 496] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__AsIs__1SHOT__20231214_082517__823.json | 0.0 | missing | missing | missing | |
| 10466 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_093813__896 | 0 | 0.0 | 17.1944 | 0 | [82, 504] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__AsIs__1SHOT__20231225_093813__896.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10467 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_093826__896 | 0 | 0.0 | 12.9006 | 0 | [1, 398] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__AsIs__1SHOT__20231225_093826__896.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10468 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | true | 5 | 20231214_082500__225 | 0 | 0.0 | 14.194 | 0 | [99, 411] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231214_082500__225.json | 50.0 | missing | missing | missing | |
| 10469 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_093736__274 | 0 | 0.0 | 28.721 | 0 | [99, 806] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231225_093736__274.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10470 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_093755__702 | 0 | 0.0 | 19.352 | 0 | [1, 578] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231225_093755__702.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10471 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | true | 5 | 20231227_022112__800 | 0 | 0.0 | 10.0206 | 0 | [99, 295] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231227_022112__800.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10472 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | true | 5 | 20231227_080554__504 | 0 | 0.0 | 21.2807 | 0 | [99, 614] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231227_080554__504.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10473 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_082446__587 | 0 | 0.0 | 15.4985 | 0 | [128, 440] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_082446__587.json | 0.0 | missing | missing | missing | |
| 10474 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_093658__707 | 0 | 0.0 | 6.29249 | 0 | [128, 173] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_093658__707.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10475 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_093707__367 | 0 | 0.0 | 8.54217 | 0 | [1, 266] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_093707__367.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10476 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_022102__522 | 0 | 0.0 | 11.2893 | 0 | [128, 326] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_022102__522.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10477 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_080531__308 | 0 | 0.0 | 10.1158 | 0 | [128, 287] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_080531__308.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10478 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_082431__596 | 0 | 0.0 | 20.6258 | 0 | [229, 537] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_082431__596.json | 50.0 | missing | missing | missing | |
| 10479 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_093640__413 | 0 | 0.0 | 21.9471 | 2 | [247, 436] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093640__413.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10480 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_093652__233 | 0 | 0.0 | 12.0759 | 0 | [1, 356] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093652__233.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10481 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_022051__506 | 0 | 0.0 | 23.1667 | 0 | [247, 485] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_022051__506.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10482 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_080521__220 | 0 | 0.0 | 23.107 | 0 | [247, 373] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_080521__220.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10483 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_082600__493 | 0 | 0.0 | 19.5082 | 0 | [11, 525] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082600__493.json | 0.0 | missing | missing | missing | |
| 10484 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_093930__737 | 0 | 0.0 | 21.6914 | 0 | [11, 582] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093930__737.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10485 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093951__414 | 0 | 0.0 | 20.4782 | 0 | [1, 557] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093951__414.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10486 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_022215__443 | 0 | 0.0 | 24.5559 | 0 | [11, 659] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_022215__443.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10487 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_080621__394 | 0 | 0.0 | 4.80478 | 0 | [11, 135] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_080621__394.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10488 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_082541__575 | 0 | 0.0 | 23.0249 | 0 | [399, 525] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_082541__575.json | 50.0 | missing | missing | missing | |
| 10489 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093846__992 | 0 | 0.0 | 19.9981 | 0 | [399, 450] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_093846__992.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10490 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093908__419 | 0 | 0.0 | 22.766 | 0 | [1, 613] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_093908__419.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10491 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_022150__816 | 0 | 0.0 | 37.8367 | 0 | [399, 892] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_022150__816.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10492 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_080616__576 | 0 | 0.0 | 21.8456 | 0 | [399, 502] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_080616__576.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10493 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | AsIs | 1SHOT | false | false | 5 | 20231214_083641__433 | 0 | 0.0 | 17.7323 | 0 | [82, 516] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__AsIs__1SHOT__20231214_083641__433.json | 0.0 | missing | missing | missing | |
| 10494 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_100512__506 | 0 | 0.0 | 11.8025 | 0 | [96, 386] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__AsIs__1SHOT__20231225_100512__506.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10495 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | AsIs | 1SHOT | false | false | 5 | 20231225_100518__865 | 0 | 0.0 | 6.13535 | 0 | [96, 197] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__AsIs__1SHOT__20231225_100518__865.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10496 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | false | false | 5 | 20231214_083623__729 | 0 | 0.0 | 20.608 | 0 | [99, 590] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231214_083623__729.json | 0.0 | missing | missing | missing | |
| 10497 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_100452__820 | 0 | 0.0 | 9.60036 | 2 | [99, 309] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231225_100452__820.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10498 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_100500__136 | 0 | 0.0 | 8.27445 | 1 | [99, 265] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231225_100500__136.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10499 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_023242__227 | 0 | 0.0 | 10.6637 | 1 | [99, 343] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231227_023242__227.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10500 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_081747__479 | 0 | 0.0 | 7.57262 | 2 | [99, 240] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231227_081747__479.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10501 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_083602__361 | 0 | 0.0 | 2.72257 | 0 | [128, 59] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_083602__361.json | 50.0 | missing | missing | missing | |
| 10502 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_100427__371 | 0 | 0.0 | 12.1747 | 0 | [138, 384] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_100427__371.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10503 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_100442__925 | 0 | 0.0 | 14.3777 | 0 | [138, 454] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_100442__925.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10504 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_023231__146 | 0 | 0.0 | 9.88232 | 2 | [138, 312] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_023231__146.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10505 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_081740__369 | 0 | 0.0 | 7.64161 | 0 | [138, 236] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_081740__369.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10506 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_083559__846 | 0 | 0.0 | 26.5545 | 0 | [229, 690] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083559__846.json | 50.0 | missing | missing | missing | |
| 10507 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_100406__545 | 0 | 0.0 | 14.967 | 0 | [239, 255] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100406__545.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10508 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_100415__864 | 0 | 0.0 | 8.99197 | 2 | [239, 262] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100415__864.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10509 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_023221__803 | 0 | 0.0 | 14.8256 | 0 | [239, 267] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_023221__803.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10510 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_081732__292 | 0 | 0.0 | 18.985 | 0 | [239, 400] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_081732__292.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10511 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_083734__729 | 0 | 0.0 | 29.4308 | 0 | [11, 767] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_083734__729.json | 50.0 | missing | missing | missing | |
| 10512 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_100552__862 | 0 | 0.0 | 14.1844 | 2 | [402, 398] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100552__862.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10513 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_100602__770 | 0 | 0.0 | 9.70611 | 2 | [402, 259] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100602__770.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10514 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_081808__499 | 0 | 0.0 | 8.51722 | 0 | [402, 217] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_081808__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10515 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_083704__958 | 0 | 0.0 | 23.7742 | 0 | [399, 544] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_083704__958.json | 50.0 | missing | missing | missing | |
| 10516 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_100528__669 | 0 | 0.0 | 9.91606 | 2 | [399, 265] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_100528__669.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10517 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_100538__655 | 0 | 0.0 | 9.26251 | 3 | [399, 245] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_100538__655.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10518 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_081800__869 | 0 | 0.0 | 12.579 | 0 | [399, 344] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_081800__869.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10519 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_182630__968 | 0 | 0.0 | 20.1125 | 0 | [99, 373] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182630__968.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10520 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_182647__648 | 0 | 0.0 | 16.794 | 3 | [99, 321] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182647__648.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10521 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_182705__820 | 0 | 0.0 | 17.8197 | 0 | [99, 341] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182705__820.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10522 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182534__861 | 0 | 0.0 | 19.7071 | 3 | [138, 368] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182534__861.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10523 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182550__300 | 0 | 0.0 | 15.8922 | 0 | [138, 295] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182550__300.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10524 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182610__864 | 0 | 0.0 | 19.6312 | 2 | [138, 366] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182610__864.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10525 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_182441__308 | 0 | 0.0 | 19.5005 | 0 | [239, 356] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182441__308.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10526 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_182458__436 | 0 | 0.0 | 17.0908 | 0 | [239, 302] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182458__436.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10527 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_182515__992 | 0 | 0.0 | 16.8914 | 3 | [239, 302] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182515__992.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10528 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_182825__195 | 0 | 0.0 | 30.765 | 2 | [402, 525] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182825__195.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10529 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_182844__657 | 0 | 0.0 | 19.224 | 3 | [402, 328] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182844__657.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10530 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_182913__575 | 0 | 0.0 | 28.4236 | 2 | [402, 501] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182913__575.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10531 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182717__997 | 0 | 0.0 | 11.9921 | 0 | [399, 191] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182717__997.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10532 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182735__949 | 0 | 0.0 | 17.3796 | 2 | [399, 294] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182735__949.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10533 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182754__580 | 0 | 0.0 | 19.2066 | 0 | [399, 328] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182754__580.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10534 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231213_205457__160 | 0 | 0.00352216 | 8.93738 | 0 | [94, 404] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__AsIs__1SHOT__20231213_205457__160.json | 0.0 | missing | missing | missing | |
| 10535 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_230531__135 | 0 | 0.00310957 | 11.7827 | 0 | [94, 353] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__AsIs__1SHOT__20231225_230531__135.json | 0.0 | missing | missing | missing | |
| 10536 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | AsIs | 1SHOT | false | false | 5 | 20231225_230600__554 | 0 | 0.00309339 | 28.9507 | 0 | [94, 351] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__AsIs__1SHOT__20231225_230600__554.json | 0.0 | missing | missing | missing | |
| 10537 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium--optim | AsIs | 1SHOT | false | false | 5 | 20231215_201623__726 | 0 | 0.0 | 11.1491 | 0 | [94, 504] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__AsIs__1SHOT__20231215_201623__726.json | 0.0 | 0.9 | missing | 0.3 | |
| 10538 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231213_205448__231 | 0 | 0.00827909 | 60.4732 | 0 | [97, 991] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__InJulia__1SHOT__20231213_205448__231.json | 25.0 | missing | missing | missing | |
| 10539 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231225_230454__296 | 0 | 0.00469522 | 17.3021 | 0 | [97, 548] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__InJulia__1SHOT__20231225_230454__296.json | 25.0 | missing | missing | missing | |
| 10540 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231225_230519__244 | 0 | 0.00476803 | 24.1893 | 0 | [97, 557] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__InJulia__1SHOT__20231225_230519__244.json | 25.0 | missing | missing | missing | |
| 10541 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | InJulia | 1SHOT | true | false | 5 | 20231227_205441__807 | 0 | 0.00425027 | 23.2383 | 0 | [97, 493] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__InJulia__1SHOT__20231227_205441__807.json | 25.0 | missing | missing | missing | |
| 10542 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_205453__101 | 0 | 0.00324711 | 11.188 | 2 | [97, 369] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__InJulia__1SHOT__20231227_205453__101.json | 66.6667 | missing | missing | missing | |
| 10543 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium--optim | InJulia | 1SHOT | true | false | 5 | 20231215_201612__224 | 0 | 0.0 | 28.9475 | 0 | [97, 711] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__InJulia__1SHOT__20231215_201612__224.json | 25.0 | 0.9 | missing | 0.3 | |
| 10544 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_205348__699 | 0 | 0.00480861 | 20.5368 | 0 | [136, 549] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_205348__699.json | 25.0 | missing | missing | missing | |
| 10545 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_230427__350 | 0 | 0.00517266 | 24.0023 | 0 | [136, 594] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_230427__350.json | 25.0 | missing | missing | missing | |
| 10546 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_230437__548 | 0 | 0.00278611 | 10.0868 | 0 | [136, 299] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_230437__548.json | 25.0 | missing | missing | missing | |
| 10547 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_205402__832 | 0 | 0.00281038 | 13.4005 | 0 | [136, 302] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_205402__832.json | 25.0 | missing | missing | missing | |
| 10548 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_205418__222 | 0 | 0.00454973 | 15.7252 | 0 | [136, 517] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_205418__222.json | 25.0 | missing | missing | missing | |
| 10549 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231215_201543__616 | 0 | 0.0 | 8.33034 | 0 | [136, 375] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_201543__616.json | 25.0 | 0.9 | missing | 0.3 | |
| 10550 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_205327__150 | 0 | 0.00440175 | 15.8288 | 0 | [237, 465] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_205327__150.json | 25.0 | missing | missing | missing | |
| 10551 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_230324__829 | 0 | 0.00496805 | 17.2984 | 0 | [237, 535] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230324__829.json | 25.0 | missing | missing | missing | |
| 10552 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_230403__222 | 0 | 0.00620582 | 38.2692 | 0 | [237, 688] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230403__222.json | 25.0 | missing | missing | missing | |
| 10553 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_205249__227 | 0 | 0.00497614 | 53.7845 | 0 | [237, 536] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_205249__227.json | 50.0 | missing | missing | missing | |
| 10554 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_205349__905 | 0 | 0.00555053 | 59.3882 | 0 | [237, 607] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_205349__905.json | 25.0 | missing | missing | missing | |
| 10555 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231215_201534__882 | 0 | 0.0 | 17.4374 | 0 | [237, 526] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_201534__882.json | 25.0 | 0.9 | missing | 0.3 | |
| 10556 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_205556__188 | 0 | 0.00747649 | 39.073 | 3 | [399, 791] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_205556__188.json | 75.0 | missing | missing | missing | |
| 10557 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_230732__326 | 0 | 0.00822077 | 20.3986 | 0 | [399, 883] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230732__326.json | 25.0 | missing | missing | missing | |
| 10558 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_230752__861 | 0 | 0.00489578 | 19.5408 | 3 | [399, 472] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230752__861.json | 75.0 | missing | missing | missing | |
| 10559 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_205608__873 | 0 | 0.00568051 | 29.0409 | 0 | [399, 569] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_205608__873.json | 25.0 | missing | missing | missing | |
| 10560 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_205645__320 | 0 | 0.00697491 | 36.0381 | 3 | [399, 729] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_205645__320.json | 75.0 | missing | missing | missing | |
| 10561 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_201716__617 | 0 | 0.0 | 32.0649 | 3 | [399, 814] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_201716__617.json | 75.0 | 0.9 | missing | 0.3 | |
| 10562 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_205516__577 | 0 | 0.00771918 | 18.8892 | 0 | [396, 822] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_205516__577.json | 25.0 | missing | missing | missing | |
| 10563 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_230656__320 | 0 | 0.00682119 | 56.2858 | 3 | [396, 711] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_230656__320.json | 75.0 | missing | missing | missing | |
| 10564 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_230712__544 | 0 | 0.00619826 | 14.6414 | 0 | [396, 634] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_230712__544.json | 50.0 | missing | missing | missing | |
| 10565 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_205503__556 | 0 | 0.00461262 | 10.1996 | 3 | [396, 438] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_205503__556.json | 75.0 | missing | missing | missing | |
| 10566 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_205539__285 | 0 | 0.00886796 | 35.2004 | 0 | [396, 964] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_205539__285.json | 25.0 | missing | missing | missing | |
| 10567 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_201644__749 | 0 | 0.0 | 21.0218 | 2 | [396, 913] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_201644__749.json | 66.6667 | 0.9 | missing | 0.3 | |
| 10568 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231213_205253__975 | 0 | 0.000881439 | 5.63154 | 0 | [97, 422] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__AsIs__1SHOT__20231213_205253__975.json | 0.0 | missing | missing | missing | |
| 10569 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_230231__680 | 0 | 0.000113199 | 0.527859 | 0 | [97, 26] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__AsIs__1SHOT__20231225_230231__680.json | 0.0 | missing | missing | missing | |
| 10570 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | AsIs | 1SHOT | false | false | 5 | 20231225_230237__195 | 0 | 0.000896959 | 5.88279 | 0 | [97, 430] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__AsIs__1SHOT__20231225_230237__195.json | 0.0 | missing | missing | missing | |
| 10571 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small--optim | AsIs | 1SHOT | false | false | 5 | 20231215_201504__794 | 0 | 0.0 | 3.79975 | 0 | [97, 285] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__AsIs__1SHOT__20231215_201504__794.json | 0.0 | 0.9 | missing | 0.3 | |
| 10572 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231213_205247__802 | 0 | 0.00093576 | 5.95786 | 0 | [100, 449] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__InJulia__1SHOT__20231213_205247__802.json | 50.0 | missing | missing | missing | |
| 10573 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_230223__530 | 0 | 0.00096098 | 6.19906 | 3 | [100, 462] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__InJulia__1SHOT__20231225_230223__530.json | 75.0 | missing | missing | missing | |
| 10574 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231225_230231__411 | 0 | 0.00100172 | 6.51122 | 0 | [100, 483] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__InJulia__1SHOT__20231225_230231__411.json | 50.0 | missing | missing | missing | |
| 10575 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_205120__117 | 0 | 0.0010638 | 6.89236 | 1 | [100, 515] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__InJulia__1SHOT__20231227_205120__117.json | 58.3333 | missing | missing | missing | |
| 10576 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | InJulia | 1SHOT | true | true | 5 | 20231227_205127__293 | 0 | 0.00107544 | 7.08802 | 3 | [100, 521] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__InJulia__1SHOT__20231227_205127__293.json | 75.0 | missing | missing | missing | |
| 10577 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small--optim | InJulia | 1SHOT | true | true | 5 | 20231215_201500__425 | 0 | 0.0 | 8.68422 | 3 | [100, 525] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__InJulia__1SHOT__20231215_201500__425.json | 75.0 | 0.9 | missing | 0.3 | |
| 10578 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_205241__981 | 0 | 0.000657707 | 3.93437 | 1 | [141, 292] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_205241__981.json | 58.3333 | missing | missing | missing | |
| 10579 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_230210__783 | 0 | 0.000543247 | 3.32363 | 3 | [141, 233] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_230210__783.json | 75.0 | missing | missing | missing | |
| 10580 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_230217__612 | 0 | 0.00102049 | 6.43145 | 3 | [141, 479] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_230217__612.json | 75.0 | missing | missing | missing | |
| 10581 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_205106__719 | 0 | 0.000768287 | 5.1578 | 1 | [141, 349] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_205106__719.json | 58.3333 | missing | missing | missing | |
| 10582 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_205113__350 | 0 | 0.00103601 | 6.58957 | 1 | [141, 487] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_205113__350.json | 58.3333 | missing | missing | missing | |
| 10583 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_201451__678 | 0 | 0.0 | 3.79555 | 1 | [141, 284] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_201451__678.json | 58.3333 | 0.9 | missing | 0.3 | |
| 10584 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_205237__497 | 0 | 0.000321474 | 1.26306 | 0 | [242, 85] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_205237__497.json | 0.0 | missing | missing | missing | |
| 10585 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_230202__994 | 0 | 0.000868554 | 5.0988 | 0 | [242, 367] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230202__994.json | 50.0 | missing | missing | missing | |
| 10586 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_230207__875 | 0 | 0.000851094 | 4.9288 | 0 | [242, 358] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230207__875.json | 50.0 | missing | missing | missing | |
| 10587 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_205055__775 | 0 | 0.000899594 | 5.26817 | 0 | [242, 383] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_205055__775.json | 50.0 | missing | missing | missing | |
| 10588 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_205101__719 | 0 | 0.000934514 | 5.51094 | 3 | [242, 401] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_205101__719.json | 75.0 | missing | missing | missing | |
| 10589 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_201447__261 | 0 | 0.0 | 7.82935 | 0 | [242, 586] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_201447__261.json | 50.0 | 0.9 | missing | 0.3 | |
| 10590 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_205311__241 | 0 | 0.00128959 | 7.17722 | 3 | [407, 529] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_205311__241.json | 75.0 | missing | missing | missing | |
| 10591 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_230305__696 | 0 | 0.00122363 | 6.74286 | 2 | [407, 495] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230305__696.json | 66.6667 | missing | missing | missing | |
| 10592 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_230307__664 | 0 | 0.000439869 | 1.45096 | 0 | [407, 91] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230307__664.json | 0.0 | missing | missing | missing | |
| 10593 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_205153__973 | 0 | 0.00158059 | 9.33498 | 0 | [407, 679] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_205153__973.json | 25.0 | missing | missing | missing | |
| 10594 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_205155__804 | 0 | 0.000544629 | 2.11807 | 0 | [407, 145] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_205155__804.json | 0.0 | missing | missing | missing | |
| 10595 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_201516__810 | 0 | 0.0 | 5.94771 | 3 | [407, 444] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_201516__810.json | 75.0 | 0.9 | missing | 0.3 | |
| 10596 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_205304__592 | 0 | 0.00173644 | 10.3524 | 0 | [405, 760] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_205304__592.json | 0.0 | missing | missing | missing | |
| 10597 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_230245__774 | 0 | 0.00136201 | 7.84885 | 0 | [405, 567] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_230245__774.json | 25.0 | missing | missing | missing | |
| 10598 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_230258__773 | 0 | 0.00202162 | 12.5804 | 0 | [405, 907] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_230258__773.json | 0.0 | missing | missing | missing | |
| 10599 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_205135__426 | 0 | 0.00141245 | 8.02349 | 0 | [405, 593] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_205135__426.json | 0.0 | missing | missing | missing | |
| 10600 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_205143__256 | 0 | 0.00140663 | 8.02485 | 2 | [405, 590] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_205143__256.json | 66.6667 | missing | missing | missing | |
| 10601 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-small--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_201510__448 | 0 | 0.0 | 5.92632 | 3 | [405, 438] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_201510__448.json | 75.0 | 0.9 | missing | 0.3 | |
| 10602 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231213_205216__775 | 0 | 0.000186173 | 6.33548 | 0 | [97, 381] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-tiny/evaluation__AsIs__1SHOT__20231213_205216__775.json | 0.0 | missing | missing | missing | |
| 10603 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_230130__229 | 0 | 0.00019025 | 3.39454 | 0 | [97, 390] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-tiny/evaluation__AsIs__1SHOT__20231225_230130__229.json | 0.0 | missing | missing | missing | |
| 10604 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-tiny | AsIs | 1SHOT | false | false | 5 | 20231225_230134__280 | 0 | 0.000209729 | 3.85297 | 0 | [97, 433] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-tiny/evaluation__AsIs__1SHOT__20231225_230134__280.json | 0.0 | missing | missing | missing | |
| 10605 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-tiny--optim | AsIs | 1SHOT | false | false | 5 | 20231215_201431__713 | 0 | 0.0 | 2.38031 | 0 | [97, 282] | 0.10.0-DEV | 3 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-tiny/evaluation__AsIs__1SHOT__20231215_201431__713.json | 0.0 | 0.9 | missing | 0.3 | |
| 10606 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231213_205210__325 | 0 | 0.000302561 | 10.9469 | 0 | [100, 637] | 0.10.0-DEV | 3 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-tiny/evaluation__InJulia__1SHOT__20231213_205210__325.json | 50.0 | missing | missing | missing | |
| 10607 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_230122__827 | 0 | 0.000286706 | 5.27622 | 0 | [100, 602] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-tiny/evaluation__InJulia__1SHOT__20231225_230122__827.json | 50.0 | missing | missing | missing | |
| 10608 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231225_230127__919 | 0 | 0.000271757 | 4.91779 | 3 | [100, 569] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-tiny/evaluation__InJulia__1SHOT__20231225_230127__919.json | 75.0 | missing | missing | missing | |
| 10609 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral-tiny | InJulia | 1SHOT | true | true | 5 | 20231227_205022__311 | 0 | 0.000251825 | 4.71834 | 2 | [100, 525] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral-tiny/evaluation__InJulia__1SHOT__20231227_205022__311.json | 66.6667 | missing | missing | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |
| 10681 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_003651__690 | 0 | 0.0 | 21.219 | 0 | [98, 526] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_003651__690.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10682 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_003708__488 | 0 | 0.0 | 16.6165 | 0 | [98, 411] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_003708__488.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10683 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_003422__170 | 0 | 0.0 | 12.1397 | 0 | [139, 292] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003422__170.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10684 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_003446__363 | 0 | 0.0 | 23.5476 | 0 | [139, 577] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003446__363.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10685 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_003506__396 | 0 | 0.0 | 20.4588 | 3 | [139, 501] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003506__396.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10686 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_003518__240 | 0 | 0.0 | 11.4106 | 0 | [139, 273] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003518__240.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10687 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_003529__988 | 0 | 0.0 | 10.4152 | 0 | [139, 248] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003529__988.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10688 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_003252__143 | 0 | 0.0 | 20.1184 | 0 | [240, 454] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003252__143.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10689 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_003308__349 | 0 | 0.0 | 16.2048 | 2 | [240, 378] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003308__349.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10690 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_003330__984 | 0 | 0.0 | 21.1189 | 0 | [240, 499] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003330__984.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10691 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_003354__410 | 0 | 0.0 | 24.0177 | 0 | [240, 570] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003354__410.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10692 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_003410__736 | 0 | 0.0 | 15.8898 | 0 | [240, 370] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003410__736.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10693 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231228_003943__146 | 0 | 0.0 | 34.7883 | 0 | [406, 792] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_003943__146.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10694 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_003958__518 | 0 | 0.0 | 15.6094 | 0 | [406, 336] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_003958__518.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10695 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004041__882 | 0 | 0.0 | 42.1535 | 0 | [406, 960] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004041__882.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10696 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004100__367 | 0 | 0.0 | 19.1591 | 0 | [406, 422] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004100__367.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10697 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004119__235 | 0 | 0.0 | 18.6671 | 0 | [406, 410] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004119__235.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10698 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231228_003727__941 | 0 | 0.0 | 18.6675 | 0 | [404, 410] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003727__941.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10699 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_003758__400 | 0 | 0.0 | 31.2924 | 0 | [404, 711] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003758__400.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10700 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231228_003818__116 | 0 | 0.0 | 20.5867 | 0 | [404, 457] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003818__116.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10701 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_003836__150 | 0 | 0.0 | 17.4545 | 1 | [404, 381] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003836__150.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10702 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231228_003908__123 | 0 | 0.0 | 31.8039 | 0 | [404, 723] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003908__123.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10703 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_123837__999 | 0 | 0.0 | 25.7276 | 0 | [95, 468] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_123837__999.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10704 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_123853__174 | 0 | 0.0 | 16.0226 | 0 | [95, 291] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_123853__174.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10705 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231226_123720__312 | 0 | 0.0 | 20.6902 | 0 | [98, 370] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_123720__312.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10706 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_123811__479 | 0 | 0.0 | 50.6522 | 0 | [98, 904] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_123811__479.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10707 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_083854__936 | 0 | 0.0 | 30.2274 | 0 | [98, 552] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_083854__936.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10708 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_123643__213 | 0 | 0.0 | 16.4589 | 0 | [139, 293] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_123643__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10709 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_123659__129 | 0 | 0.0 | 16.6243 | 0 | [139, 295] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_123659__129.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10710 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_083824__711 | 0 | 0.0 | 24.3254 | 0 | [139, 439] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_083824__711.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10711 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_123558__231 | 0 | 0.0 | 18.7653 | 0 | [240, 318] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_123558__231.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10712 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_123626__720 | 0 | 0.0 | 28.0938 | 1 | [240, 485] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_123626__720.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10713 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_083759__903 | 0 | 0.0 | 28.8019 | 1 | [240, 348] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_083759__903.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10714 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_124027__812 | 0 | 0.0 | 28.4296 | 0 | [406, 473] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124027__812.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10715 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_124056__362 | 0 | 0.0 | 28.3595 | 0 | [406, 469] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124056__362.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10716 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_084016__307 | 0 | 0.0 | 39.4433 | 0 | [406, 673] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084016__307.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10717 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_123914__648 | 0 | 0.0 | 21.0367 | 2 | [404, 346] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_123914__648.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10718 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_123959__426 | 0 | 0.0 | 44.7343 | 0 | [404, 744] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_123959__426.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10719 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_083936__668 | 0 | 0.0 | 42.4674 | 0 | [404, 726] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_083936__668.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10720 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_084450__966 | 0 | 0.0 | 62.9222 | 0 | [104, 367] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_084450__966.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10721 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_130656__369 | 0 | 0.0 | 78.4812 | 3 | [104, 460] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_130656__369.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10722 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_130821__420 | 0 | 0.0 | 84.4554 | 0 | [104, 496] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_130821__420.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10723 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_130929__145 | 0 | 0.0 | 68.3589 | 3 | [104, 400] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_130929__145.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10724 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084347__351 | 0 | 0.0 | 31.534 | 3 | [143, 171] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_084347__351.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10725 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_130350__584 | 0 | 0.0 | 45.334 | 0 | [143, 254] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_130350__584.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10726 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_130500__827 | 0 | 0.0 | 70.153 | 0 | [143, 398] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_130500__827.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10727 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_130537__696 | 0 | 0.0 | 37.0932 | 0 | [143, 203] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_130537__696.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10728 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_084316__512 | 0 | 0.0 | 139.065 | 0 | [244, 652] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084316__512.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10729 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_130024__970 | 0 | 0.0 | 107.177 | 2 | [244, 540] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_130024__970.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10730 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_130129__810 | 0 | 0.0 | 65.3052 | 0 | [244, 343] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_130129__810.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10731 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_130304__384 | 0 | 0.0 | 94.4254 | 1 | [244, 525] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_130304__384.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10732 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_084603__748 | 0 | 0.0 | 11.7022 | 0 | [417, 4] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084603__748.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10733 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_131547__743 | 0 | 0.0 | 108.162 | 0 | [417, 570] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_131547__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10734 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_131830__318 | 0 | 0.0 | 12.1704 | 0 | [417, 5] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_131830__318.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10735 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_084551__827 | 0 | 0.0 | 60.4446 | 0 | [415, 297] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_084551__827.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10736 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_131129__941 | 0 | 0.0 | 119.689 | 3 | [415, 639] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_131129__941.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10737 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_131254__683 | 0 | 0.0 | 84.9412 | 0 | [415, 440] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_131254__683.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10738 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_131359__723 | 0 | 0.0 | 64.4689 | 0 | [415, 321] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_131359__723.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10739 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_113416__876 | 0 | 0.0 | 10.1174 | 0 | [104, 247] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_113416__876.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10740 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_113421__388 | 0 | 0.0 | 4.68596 | 0 | [104, 106] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_113421__388.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10741 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_113346__103 | 0 | 0.0 | 16.2312 | 0 | [107, 404] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_113346__103.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10742 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_113406__582 | 0 | 0.0 | 19.6671 | 1 | [107, 491] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_113406__582.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10743 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_083546__833 | 0 | 0.0 | 13.3685 | 0 | [107, 329] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_083546__833.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10744 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_113322__743 | 0 | 0.0 | 11.2448 | 0 | [148, 271] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_113322__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10745 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_113329__729 | 0 | 0.0 | 7.54025 | 1 | [148, 176] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_113329__729.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10746 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_083532__438 | 0 | 0.0 | 9.45103 | 0 | [148, 224] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_083532__438.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10747 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_113253__733 | 0 | 0.0 | 17.9369 | 0 | [249, 259] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113253__733.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10748 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_113311__600 | 0 | 0.0 | 17.3532 | 2 | [249, 409] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113311__600.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10749 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_083523__907 | 0 | 0.0 | 16.7413 | 0 | [249, 243] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_083523__907.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10750 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_113517__443 | 0 | 0.0 | 18.0091 | 2 | [415, 397] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113517__443.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10751 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_113528__112 | 0 | 0.0 | 10.9723 | 2 | [415, 223] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113528__112.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10752 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_083621__183 | 0 | 0.0 | 17.6033 | 2 | [415, 385] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_083621__183.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10753 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_113441__474 | 0 | 0.0 | 20.6363 | 3 | [413, 462] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_113441__474.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10754 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_113459__108 | 0 | 0.0 | 17.7118 | 0 | [413, 390] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_113459__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10755 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_083603__578 | 0 | 0.0 | 17.4645 | 2 | [413, 382] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_083603__578.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10756 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231214_082710__362 | 0 | 0.0 | 13.5013 | 0 | [82, 397] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_082710__362.json | 0.0 | missing | missing | missing | |
| 10757 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_094112__935 | 0 | 0.0 | 6.3483 | 0 | [102, 196] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_094112__935.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10758 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_094121__816 | 0 | 0.0 | 9.22851 | 0 | [102, 293] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_094121__816.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10759 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231214_082657__300 | 0 | 0.0 | 17.4722 | 0 | [99, 504] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_082657__300.json | 50.0 | missing | missing | missing | |
| 10760 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_094055__728 | 0 | 0.0 | 8.92261 | 0 | [105, 282] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_094055__728.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10761 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231225_094106__106 | 0 | 0.0 | 10.3215 | 0 | [105, 329] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_094106__106.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10762 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_022255__498 | 0 | 0.0 | 12.4375 | 3 | [105, 395] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_022255__498.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10763 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_080715__491 | 0 | 0.0 | 17.1303 | 1 | [105, 543] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_080715__491.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10764 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_082639__853 | 0 | 0.0 | 19.2866 | 0 | [128, 545] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_082639__853.json | 0.0 | missing | missing | missing | |
| 10765 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_094035__155 | 0 | 0.0 | 10.4509 | 0 | [146, 327] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_094035__155.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10766 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_094046__807 | 0 | 0.0 | 10.3384 | 2 | [146, 324] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_094046__807.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10767 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_022242__911 | 0 | 0.0 | 8.53843 | 3 | [146, 261] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_022242__911.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10768 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_080658__437 | 0 | 0.0 | 14.5429 | 1 | [146, 454] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_080658__437.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10769 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_082620__506 | 0 | 0.0 | 19.4611 | 0 | [229, 507] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_082620__506.json | 50.0 | missing | missing | missing | |
| 10770 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094010__902 | 0 | 0.0 | 19.4256 | 0 | [247, 423] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094010__902.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10771 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094025__746 | 0 | 0.0 | 14.32 | 3 | [247, 433] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094025__746.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10772 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_022234__681 | 0 | 0.0 | 18.7157 | 0 | [247, 411] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_022234__681.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10773 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_080643__338 | 0 | 0.0 | 21.6527 | 3 | [247, 497] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_080643__338.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10774 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_082800__691 | 0 | 0.0 | 20.9763 | 0 | [11, 562] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082800__691.json | 25.0 | missing | missing | missing | |
| 10775 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_094159__797 | 0 | 0.0 | 13.5181 | 3 | [413, 375] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_094159__797.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10776 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_094213__417 | 0 | 0.0 | 13.8042 | 1 | [413, 383] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_094213__417.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10777 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_022319__688 | 0 | 0.0 | 12.6155 | 0 | [413, 344] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_022319__688.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10778 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_080741__258 | 0 | 0.0 | 10.9003 | 3 | [413, 290] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_080741__258.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10779 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_082739__902 | 0 | 0.0 | 28.7807 | 0 | [399, 668] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_082739__902.json | 25.0 | missing | missing | missing | |
| 10780 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_094132__927 | 0 | 0.0 | 11.1501 | 0 | [411, 300] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_094132__927.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10781 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_094145__127 | 0 | 0.0 | 12.8671 | 0 | [411, 355] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_094145__127.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10782 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_022307__618 | 0 | 0.0 | 11.7094 | 0 | [411, 316] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_022307__618.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10783 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_080730__530 | 0 | 0.0 | 15.164 | 3 | [411, 422] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_080730__530.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10784 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231214_084018__866 | 0 | 0.0 | 14.781 | 0 | [82, 434] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__AsIs__1SHOT__20231214_084018__866.json | 0.0 | missing | missing | missing | |
| 10785 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_101048__575 | 0 | 0.0 | 29.3049 | 0 | [99, 528] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__AsIs__1SHOT__20231225_101048__575.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10786 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_101102__171 | 0 | 0.0 | 13.7479 | 0 | [99, 244] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__AsIs__1SHOT__20231225_101102__171.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10787 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231214_084003__402 | 0 | 0.0 | 17.52 | 0 | [99, 505] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__InJulia__1SHOT__20231214_084003__402.json | 50.0 | missing | missing | missing | |
| 10788 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_100953__276 | 0 | 0.0 | 22.6805 | 0 | [102, 408] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__InJulia__1SHOT__20231225_100953__276.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10789 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_101019__226 | 0 | 0.0 | 25.5595 | 0 | [102, 455] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__InJulia__1SHOT__20231225_101019__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10790 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231227_082015__679 | 0 | 0.0 | 26.6858 | 0 | [102, 469] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__InJulia__1SHOT__20231227_082015__679.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10791 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_083945__989 | 0 | 0.0 | 7.9913 | 0 | [128, 222] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_083945__989.json | 50.0 | missing | missing | missing | |
| 10792 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_100858__871 | 0 | 0.0 | 28.1694 | 2 | [141, 500] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_100858__871.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10793 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_100931__392 | 0 | 0.0 | 32.6033 | 0 | [141, 578] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_100931__392.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10794 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_081948__679 | 0 | 0.0 | 18.719 | 0 | [141, 327] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_081948__679.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10795 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_083937__592 | 0 | 0.0 | 26.4545 | 0 | [229, 687] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083937__592.json | 50.0 | missing | missing | missing | |
| 10796 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_100807__287 | 0 | 0.0 | 32.5309 | 2 | [242, 389] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100807__287.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10797 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_100830__502 | 0 | 0.0 | 22.5545 | 2 | [242, 379] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100830__502.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10798 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_081930__714 | 0 | 0.0 | 30.8678 | 0 | [242, 356] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_081930__714.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10799 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_084107__150 | 0 | 0.0 | 26.0185 | 0 | [11, 686] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084107__150.json | 25.0 | missing | missing | missing | |
| 10800 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_101233__436 | 0 | 0.0 | 29.8327 | 0 | [405, 475] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_101233__436.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10801 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_101241__738 | 0 | 0.0 | 8.50061 | 0 | [405, 96] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_101241__738.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10802 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_082125__499 | 0 | 0.0 | 35.459 | 1 | [405, 550] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_082125__499.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10803 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_084041__581 | 0 | 0.0 | 22.9724 | 0 | [399, 524] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_084041__581.json | 0.0 | missing | missing | missing | |
| 10804 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_101127__979 | 0 | 0.0 | 25.0731 | 0 | [402, 393] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_101127__979.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10805 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_101203__739 | 0 | 0.0 | 35.8229 | 0 | [402, 578] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_101203__739.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10806 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_082049__634 | 0 | 0.0 | 34.0951 | 0 | [402, 540] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_082049__634.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10807 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_113748__365 | 0 | 0.0 | 18.5863 | 0 | [93, 702] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_113748__365.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10808 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_113811__990 | 0 | 0.0 | 22.3888 | 0 | [93, 835] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_113811__990.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10809 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_113705__591 | 0 | 0.0 | 28.7492 | 0 | [96, 1049] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_113705__591.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10810 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_113730__418 | 0 | 0.0 | 24.6802 | 0 | [96, 913] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_113730__418.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10811 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_083714__755 | 0 | 0.0 | 32.5532 | 0 | [96, 1165] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_083714__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10812 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_113617__377 | 0 | 0.0 | 42.2106 | 0 | [133, 1463] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_113617__377.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10813 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_113636__356 | 0 | 0.0 | 18.9812 | 0 | [133, 704] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_113636__356.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10814 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_083642__664 | 0 | 0.0 | 15.8399 | 0 | [133, 588] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_083642__664.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10815 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_113534__122 | 0 | 0.0 | 5.19615 | 0 | [232, 31] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113534__122.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10816 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_113535__493 | 0 | 0.0 | 1.19206 | 0 | [232, 19] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113535__493.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10817 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_083626__113 | 0 | 0.0 | 4.56707 | 0 | [232, 14] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_083626__113.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10818 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_113834__676 | 0 | 0.0 | 7.57878 | 0 | [385, 237] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113834__676.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10819 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_113842__406 | 0 | 0.0 | 8.43451 | 0 | [385, 268] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113842__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10820 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_083730__702 | 0 | 0.0 | 7.77404 | 0 | [385, 242] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_083730__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10821 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_113821__889 | 0 | 0.0 | 10.3604 | 0 | [382, 342] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_113821__889.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10822 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_113826__911 | 0 | 0.0 | 5.17462 | 0 | [382, 151] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_113826__911.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10823 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_083722__621 | 0 | 0.0 | 8.22105 | 0 | [382, 262] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_083722__621.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10824 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231214_084156__302 | 0 | 0.0 | 12.638 | 0 | [82, 372] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_084156__302.json | 0.0 | missing | missing | missing | |
| 10825 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_101757__134 | 0 | 0.0 | 56.2604 | 0 | [107, 436] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_101757__134.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10826 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_101836__354 | 0 | 0.0 | 38.8523 | 0 | [107, 297] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_101836__354.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10827 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231214_084144__124 | 0 | 0.0 | 15.371 | 0 | [99, 446] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_084144__124.json | 0.0 | missing | missing | missing | |
| 10828 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_101612__393 | 0 | 0.0 | 35.5435 | 3 | [110, 270] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_101612__393.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10829 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231225_101701__565 | 0 | 0.0 | 48.2258 | 0 | [110, 372] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_101701__565.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10830 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231227_082353__821 | 0 | 0.0 | 49.6965 | 0 | [110, 379] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_082353__821.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10831 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084128__103 | 0 | 0.0 | 2.41673 | 0 | [128, 50] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_084128__103.json | 50.0 | missing | missing | missing | |
| 10832 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_101500__548 | 0 | 0.0 | 49.6837 | 3 | [149, 377] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_101500__548.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10833 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_101537__317 | 0 | 0.0 | 36.1898 | 2 | [149, 269] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_101537__317.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10834 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_082304__453 | 0 | 0.0 | 46.7901 | 2 | [149, 349] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_082304__453.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10835 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_084126__382 | 0 | 0.0 | 18.8991 | 0 | [229, 492] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084126__382.json | 0.0 | missing | missing | missing | |
| 10836 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_101328__719 | 0 | 0.0 | 46.4892 | 2 | [250, 157] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_101328__719.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10837 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_101410__995 | 0 | 0.0 | 42.4773 | 2 | [250, 301] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_101410__995.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10838 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_082217__716 | 0 | 0.0 | 51.8384 | 2 | [250, 205] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_082217__716.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10839 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_084242__789 | 0 | 0.0 | 17.5778 | 0 | [11, 476] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084242__789.json | 25.0 | missing | missing | missing | |
| 10840 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_102048__488 | 0 | 0.0 | 59.096 | 3 | [413, 397] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_102048__488.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10841 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_102139__344 | 0 | 0.0 | 51.5749 | 1 | [413, 339] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_102139__344.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 10842 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_082602__807 | 0 | 0.0 | 81.0386 | 3 | [413, 542] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_082602__807.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10843 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_084224__867 | 0 | 0.0 | 27.9523 | 0 | [399, 648] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_084224__867.json | 25.0 | missing | missing | missing | |
| 10844 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_101921__612 | 0 | 0.0 | 45.2905 | 3 | [410, 290] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_101921__612.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10845 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_101948__446 | 0 | 0.0 | 27.2924 | 2 | [410, 149] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_101948__446.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 10846 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_082441__220 | 0 | 0.0 | 47.4156 | 3 | [410, 285] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_082441__220.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10847 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_112824__236 | 0 | 0.0 | 23.5405 | 0 | [104, 395] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_112824__236.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10848 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_112850__734 | 0 | 0.0 | 25.3522 | 0 | [104, 426] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_112850__734.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10849 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_112736__623 | 0 | 0.0 | 21.5736 | 0 | [107, 361] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_112736__623.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10850 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_112801__275 | 0 | 0.0 | 24.6148 | 0 | [107, 413] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_112801__275.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10851 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_083310__440 | 0 | 0.0 | 23.0073 | 0 | [107, 384] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_083310__440.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10852 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_112653__985 | 0 | 0.0 | 18.9684 | 0 | [148, 311] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_112653__985.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10853 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_112714__713 | 0 | 0.0 | 20.6183 | 0 | [148, 339] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_112714__713.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10854 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_083247__869 | 0 | 0.0 | 16.9588 | 0 | [148, 275] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_083247__869.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10927 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_081057__677 | 0 | 0.0 | 53.4235 | 0 | [143, 390] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_081057__677.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10928 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_083028__689 | 0 | 0.0 | 15.687 | 0 | [229, 405] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083028__689.json | 0.0 | missing | missing | missing | |
| 10929 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094543__377 | 0 | 0.0 | 44.4544 | 0 | [244, 121] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094543__377.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10930 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094559__244 | 0 | 0.0 | 15.5151 | 0 | [244, 81] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094559__244.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10931 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_022525__287 | 0 | 0.0 | 63.5399 | 0 | [244, 278] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_022525__287.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10932 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_081002__348 | 0 | 0.0 | 60.4046 | 3 | [244, 254] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_081002__348.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10933 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_083213__629 | 0 | 0.0 | 20.4187 | 0 | [11, 548] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_083213__629.json | 50.0 | missing | missing | missing | |
| 10934 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_095549__902 | 0 | 0.0 | 79.9459 | 0 | [417, 517] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_095549__902.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10935 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_095721__464 | 0 | 0.0 | 92.412 | 0 | [417, 604] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_095721__464.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10936 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_022955__439 | 0 | 0.0 | 64.3869 | 0 | [417, 414] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_022955__439.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10937 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_083152__848 | 0 | 0.0 | 40.1486 | 0 | [399, 933] | 0.10.0-DEV | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_083152__848.json | 25.0 | missing | missing | missing | |
| 10938 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_095320__947 | 0 | 0.0 | 69.7206 | 3 | [415, 451] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_095320__947.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10939 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_095429__343 | 0 | 0.0 | 68.4094 | 0 | [415, 435] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_095429__343.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10940 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_022851__900 | 0 | 0.0 | 63.7192 | 0 | [415, 413] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_022851__900.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10941 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_081310__448 | 0 | 0.0 | 56.1029 | 0 | [415, 357] | 0.10.0-DEV | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_081310__448.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10942 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | AsIs | 1SHOT | false | false | 5 | 20231214_085014__671 | 0 | 0.0 | 13.0216 | 0 | [57, 392] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231214_085014__671.json | 0.0 | missing | missing | missing | |
| 10943 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | AsIs | 1SHOT | true | true | 5 | 20231225_085754__859 | 5 | 0.0 | 14.6846 | 5 | [79, 268] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_085754__859.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10944 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | AsIs | 1SHOT | true | true | 5 | 20231225_085814__592 | 5 | 0.0 | 20.0331 | 5 | [79, 369] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_085814__592.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10945 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231214_085001__207 | 1 | 0.0 | 9.77953 | 0 | [74, 291] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_085001__207.json | 55.0 | missing | missing | missing | |
| 10946 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_085720__167 | 5 | 0.0 | 15.1671 | 5 | [82, 277] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_085720__167.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10947 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_085739__613 | 5 | 0.0 | 18.8354 | 5 | [82, 346] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_085739__613.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10948 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_085640__771 | 5 | 0.0 | 20.1605 | 5 | [82, 368] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_085640__771.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10949 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084951__804 | 1 | 0.0 | 7.61435 | 0 | [103, 215] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_084951__804.json | 55.0 | missing | missing | missing | |
| 10950 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_085703__124 | 0 | 0.0 | 4.53612 | 0 | [120, 68] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_085703__124.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10951 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_085705__672 | 0 | 0.0 | 2.65283 | 0 | [120, 31] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_085705__672.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10952 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085619__746 | 5 | 0.0 | 10.9073 | 5 | [120, 189] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_085619__746.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10953 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084943__314 | 1 | 0.0 | 15.1608 | 0 | [201, 401] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084943__314.json | 55.0 | missing | missing | missing | |
| 10954 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_085648__275 | 5 | 0.0 | 25.9908 | 5 | [219, 269] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_085648__275.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10955 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_085658__813 | 0 | 0.0 | 9.72037 | 0 | [219, 151] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_085658__813.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10956 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_085608__905 | 5 | 0.0 | 19.5126 | 5 | [219, 158] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085608__905.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10957 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_085050__679 | 1 | 0.0 | 11.7442 | 0 | [11, 326] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085050__679.json | 55.0 | missing | missing | missing | |
| 10958 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_085920__566 | 5 | 0.0 | 25.6768 | 5 | [385, 407] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_085920__566.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10959 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_085932__529 | 5 | 0.0 | 11.9565 | 5 | [385, 160] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_085932__529.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10960 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_085730__484 | 1 | 0.0 | 38.0886 | 0 | [385, 616] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085730__484.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10961 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_085038__605 | 0 | 0.0 | 24.0292 | 0 | [374, 564] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_085038__605.json | 25.0 | missing | missing | missing | |
| 10962 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_085836__934 | 5 | 0.0 | 22.2061 | 5 | [382, 350] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_085836__934.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10963 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_085855__137 | 1 | 0.0 | 18.1823 | 0 | [382, 278] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_085855__137.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10964 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_085652__984 | 5 | 0.0 | 12.4699 | 5 | [382, 172] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_085652__984.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10965 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050428__458 | 5 | 0.0 | 3.4595 | 5 | [0, 262] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050428__458.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10966 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050435__659 | 5 | 0.0 | 6.74946 | 5 | [0, 507] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050435__659.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10967 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050438__522 | 5 | 0.0 | 2.64943 | 5 | [0, 196] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050438__522.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10968 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050441__823 | 5 | 0.0 | 3.436 | 5 | [0, 254] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050441__823.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10969 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050447__104 | 5 | 0.0 | 5.3222 | 5 | [0, 397] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050447__104.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10970 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050400__532 | 4 | 0.0 | 0.937026 | 5 | [0, 71] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050400__532.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10971 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050401__854 | 5 | 0.0 | 0.536413 | 5 | [0, 40] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050401__854.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10972 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050402__350 | 5 | 0.0 | 1.06351 | 5 | [0, 79] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050402__350.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10973 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050405__647 | 1 | 0.0 | 3.07383 | 0 | [0, 231] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050405__647.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10974 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050406__240 | 5 | 0.0 | 0.995397 | 5 | [0, 75] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050406__240.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10975 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050341__754 | 5 | 0.0 | 3.37526 | 5 | [0, 251] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050341__754.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10976 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050345__810 | 5 | 0.0 | 4.08955 | 5 | [0, 303] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050345__810.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10977 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_050349__277 | 0 | 0.0 | 0.465852 | 0 | [0, 35] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050349__277.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10978 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050349__531 | 5 | 0.0 | 3.1631 | 5 | [0, 237] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050349__531.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10979 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_050350__408 | 0 | 0.0 | 0.468598 | 0 | [0, 35] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050350__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10980 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050559__977 | 5 | 0.0 | 7.90714 | 5 | [0, 558] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050559__977.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10981 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050601__571 | 5 | 0.0 | 2.43768 | 5 | [0, 175] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050601__571.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10982 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050606__903 | 5 | 0.0 | 5.11412 | 5 | [0, 367] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050606__903.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10983 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050610__854 | 5 | 0.0 | 4.1564 | 5 | [0, 298] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050610__854.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10984 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_050611__568 | 0 | 0.0 | 0.852879 | 0 | [0, 61] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050611__568.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10985 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050511__699 | 5 | 0.0 | 3.1904 | 5 | [0, 234] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050511__699.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10986 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050522__866 | 1 | 0.0 | 11.2177 | 0 | [0, 805] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050522__866.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10987 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050526__272 | 5 | 0.0 | 3.86809 | 5 | [0, 283] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050526__272.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10988 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050533__922 | 1 | 0.0 | 6.85176 | 0 | [0, 497] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050533__922.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10989 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:13b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050537__895 | 5 | 0.0 | 3.44524 | 5 | [0, 253] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050537__895.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10990 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | AsIs | 1SHOT | false | false | 5 | 20231214_085138__334 | 0 | 0.0 | 12.1226 | 0 | [57, 365] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__AsIs__1SHOT__20231214_085138__334.json | 0.0 | missing | missing | missing | |
| 10991 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | AsIs | 1SHOT | true | true | 5 | 20231225_090025__512 | 5 | 0.0 | 3.45964 | 5 | [53, 57] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_090025__512.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10992 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | AsIs | 1SHOT | true | false | 5 | 20231225_090035__567 | 0 | 0.0 | 9.67723 | 0 | [53, 178] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__AsIs__1SHOT__20231225_090035__567.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10993 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | InJulia | 1SHOT | true | true | 5 | 20231214_085126__557 | 1 | 0.0 | 13.6638 | 0 | [74, 397] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_085126__557.json | 55.0 | missing | missing | missing | |
| 10994 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231225_090019__184 | 0 | 0.0 | 19.9242 | 0 | [56, 372] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_090019__184.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10995 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_090022__817 | 0 | 0.0 | 3.08442 | 0 | [56, 50] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_090022__817.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10996 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_085111__987 | 1 | 0.0 | 6.76224 | 0 | [103, 190] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_085111__987.json | 55.0 | missing | missing | missing | |
| 10997 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_085954__129 | 0 | 0.0 | 6.72709 | 0 | [57, 121] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_085954__129.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10998 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_085959__406 | 0 | 0.0 | 4.73738 | 0 | [57, 82] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_085959__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 10999 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_085104__138 | 1 | 0.0 | 14.4655 | 0 | [201, 382] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085104__138.json | 55.0 | missing | missing | missing | |
| 11000 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_085945__318 | 0 | 0.0 | 12.3463 | 0 | [94, 30] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_085945__318.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11001 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_085947__767 | 0 | 0.0 | 2.31938 | 0 | [94, 29] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_085947__767.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11002 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_085232__431 | 1 | 0.0 | 25.9812 | 0 | [11, 686] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085232__431.json | 55.0 | missing | missing | missing | |
| 11003 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_090047__474 | 0 | 0.0 | 1.21617 | 0 | [74, 8] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090047__474.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11004 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085206__850 | 1 | 0.0 | 27.4067 | 0 | [374, 641] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_085206__850.json | 55.0 | missing | missing | missing | |
| 11005 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_090038__599 | 1 | 0.0 | 3.46579 | 0 | [71, 52] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_090038__599.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11006 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_090046__218 | 0 | 0.0 | 7.28874 | 0 | [71, 126] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_090046__218.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11007 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_051053__355 | 5 | 0.0 | 6.54395 | 5 | [0, 234] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_051053__355.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11008 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_051100__261 | 5 | 0.0 | 7.34708 | 5 | [0, 262] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_051100__261.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11009 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_051111__850 | 3 | 0.0 | 10.5321 | 4 | [0, 375] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_051111__850.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11010 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_051117__142 | 5 | 0.0 | 6.37 | 5 | [0, 230] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_051117__142.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11011 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_051126__563 | 5 | 0.0 | 8.72722 | 5 | [0, 314] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_051126__563.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11012 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050937__819 | 5 | 0.0 | 2.68762 | 5 | [0, 97] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050937__819.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11013 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050942__883 | 5 | 0.0 | 5.10756 | 5 | [0, 184] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050942__883.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11014 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050943__444 | 5 | 0.0 | 1.30362 | 5 | [0, 47] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050943__444.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11015 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050946__612 | 5 | 0.0 | 2.93639 | 5 | [0, 106] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050946__612.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11016 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050956__713 | 5 | 0.0 | 10.3191 | 5 | [0, 371] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050956__713.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11017 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050847__693 | 5 | 0.0 | 6.38476 | 5 | [0, 226] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050847__693.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11018 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050853__263 | 5 | 0.0 | 6.37915 | 5 | [0, 227] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050853__263.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11019 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050903__615 | 5 | 0.0 | 9.2406 | 5 | [0, 328] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050903__615.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11020 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050914__590 | 5 | 0.0 | 10.8814 | 5 | [0, 385] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050914__590.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11021 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_050916__880 | 0 | 0.0 | 1.86096 | 0 | [0, 66] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050916__880.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11022 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_051314__751 | 0 | 0.0 | 1.97389 | 0 | [0, 69] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_051314__751.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11023 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_051316__871 | 0 | 0.0 | 1.87955 | 0 | [0, 67] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_051316__871.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11024 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_051318__363 | 0 | 0.0 | 1.68387 | 0 | [0, 60] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_051318__363.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11025 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_051325__984 | 5 | 0.0 | 7.76245 | 5 | [0, 275] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_051325__984.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11026 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_051332__107 | 0 | 0.0 | 6.39632 | 0 | [0, 227] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_051332__107.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11027 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_051138__809 | 0 | 0.0 | 0.142793 | 0 | [0, 5] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_051138__809.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11028 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_051138__863 | 0 | 0.0 | 1.67716 | 0 | [0, 60] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_051138__863.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11029 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_051138__942 | 0 | 0.0 | 0.142688 | 0 | [0, 5] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_051138__942.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11030 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_051139__105 | 0 | 0.0 | 0.142829 | 0 | [0, 5] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_051139__105.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11031 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_051141__442 | 0 | 0.0 | 2.20745 | 0 | [0, 79] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_051141__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11032 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_045952__120 | 5 | 0.0 | 6.59737 | 5 | [0, 161] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_045952__120.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11033 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_045954__821 | 0 | 0.0 | 2.49251 | 0 | [0, 61] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_045954__821.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11034 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_050000__279 | 0 | 0.0 | 6.17068 | 0 | [0, 150] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_050000__279.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11035 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 5 | 20240201_050006__464 | 0 | 0.0 | 0.569122 | 0 | [0, 14] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_050006__464.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11036 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 5 | 20240201_050006__978 | 1 | 0.0 | 5.18364 | 0 | [0, 128] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_050006__978.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11037 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_045732__356 | 0 | 0.0 | 7.58911 | 0 | [0, 187] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_045732__356.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11038 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_045752__100 | 1 | 0.0 | 19.614 | 5 | [0, 478] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_045752__100.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11039 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_045802__920 | 1 | 0.0 | 10.5228 | 0 | [0, 258] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_045802__920.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11040 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_045813__887 | 4 | 0.0 | 11.0886 | 5 | [0, 272] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_045813__887.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11041 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_045825__964 | 1 | 0.0 | 11.7705 | 0 | [0, 289] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_045825__964.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11042 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_045543__916 | 1 | 0.0 | 6.84869 | 0 | [0, 167] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_045543__916.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11043 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_045553__753 | 0 | 0.0 | 9.68512 | 0 | [0, 237] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_045553__753.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11044 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_045555__592 | 0 | 0.0 | 2.45156 | 0 | [0, 60] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_045555__592.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11045 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_045610__619 | 5 | 0.0 | 14.5322 | 5 | [0, 354] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_045610__619.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11046 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_045624__246 | 5 | 0.0 | 13.5228 | 5 | [0, 326] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_045624__246.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11047 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_050237__623 | 0 | 0.0 | 5.38347 | 0 | [0, 131] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050237__623.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11048 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_050247__923 | 0 | 0.0 | 9.56984 | 0 | [0, 232] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050247__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11049 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050250__524 | 5 | 0.0 | 2.95265 | 5 | [0, 72] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050250__524.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11050 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_050309__628 | 0 | 0.0 | 18.8056 | 0 | [0, 454] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050309__628.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11051 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050318__619 | 4 | 0.0 | 9.65953 | 5 | [0, 234] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050318__619.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11052 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_050104__575 | 0 | 0.0 | 4.52344 | 0 | [0, 110] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_050104__575.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11053 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050126__460 | 1 | 0.0 | 21.7241 | 5 | [0, 523] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_050126__460.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11054 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_050131__816 | 0 | 0.0 | 5.10306 | 0 | [0, 124] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_050131__816.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11055 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050141__417 | 5 | 0.0 | 9.63646 | 5 | [0, 233] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_050141__417.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11056 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_050150__626 | 0 | 0.0 | 9.66742 | 0 | [0, 234] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_050150__626.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11057 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_044902__450 | 0 | 0.0 | 11.2984 | 0 | [0, 211] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_044902__450.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11058 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_044905__510 | 0 | 0.0 | 3.25645 | 0 | [0, 61] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_044905__510.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11059 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20240201_044912__710 | 0 | 0.0 | 6.72256 | 0 | [0, 126] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_044912__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11060 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_044919__750 | 5 | 0.0 | 7.57969 | 5 | [0, 142] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_044919__750.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11061 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_044932__744 | 5 | 0.0 | 12.7053 | 5 | [0, 234] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_044932__744.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11062 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_044617__626 | 0 | 0.0 | 23.1788 | 0 | [0, 431] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_044617__626.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11063 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_044641__839 | 1 | 0.0 | 23.7541 | 0 | [0, 442] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_044641__839.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11064 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_044651__877 | 5 | 0.0 | 9.89352 | 5 | [0, 185] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_044651__877.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11065 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_044706__371 | 5 | 0.0 | 14.4583 | 5 | [0, 270] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_044706__371.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11066 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_044732__606 | 5 | 0.0 | 26.5561 | 5 | [0, 494] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_044732__606.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11067 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_044401__149 | 5 | 0.0 | 10.9767 | 5 | [0, 205] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_044401__149.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11068 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_044419__987 | 5 | 0.0 | 17.5564 | 5 | [0, 327] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_044419__987.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11069 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_044429__344 | 0 | 0.0 | 9.80422 | 0 | [0, 183] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_044429__344.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11070 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_044440__531 | 5 | 0.0 | 11.7285 | 5 | [0, 219] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_044440__531.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11071 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20240201_044455__425 | 0 | 0.0 | 14.6467 | 0 | [0, 273] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_044455__425.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11072 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_045312__773 | 0 | 0.0 | 12.1425 | 0 | [0, 224] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_045312__773.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11073 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_045320__227 | 0 | 0.0 | 7.32776 | 0 | [0, 136] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_045320__227.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11074 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_045346__418 | 0 | 0.0 | 25.9219 | 0 | [0, 476] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_045346__418.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11075 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_045411__850 | 0 | 0.0 | 25.5634 | 0 | [0, 468] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_045411__850.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11076 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_045427__732 | 5 | 0.0 | 15.5464 | 5 | [0, 286] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_045427__732.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11077 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_045113__417 | 5 | 0.0 | 17.5381 | 5 | [0, 321] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_045113__417.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11078 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_045121__564 | 5 | 0.0 | 8.20469 | 5 | [0, 151] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_045121__564.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11079 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_045134__872 | 0 | 0.0 | 12.8024 | 0 | [0, 237] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_045134__872.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11080 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_045142__391 | 5 | 0.0 | 8.56173 | 5 | [0, 159] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_045142__391.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11081 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_045152__876 | 1 | 0.0 | 9.80404 | 0 | [0, 182] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_045152__876.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11082 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050646__834 | 5 | 0.0 | 1.85934 | 5 | [0, 225] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050646__834.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11083 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050648__176 | 5 | 0.0 | 2.01379 | 5 | [0, 239] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050648__176.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11084 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050650__667 | 5 | 0.0 | 1.2761 | 5 | [0, 148] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050650__667.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11085 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050651__328 | 1 | 0.0 | 1.9469 | 5 | [0, 223] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050651__328.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11086 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20240201_050653__635 | 5 | 0.0 | 1.36283 | 5 | [0, 158] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_050653__635.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11087 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_050637__219 | 0 | 0.0 | 0.471718 | 0 | [0, 55] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050637__219.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11088 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050638__320 | 5 | 0.0 | 0.879573 | 5 | [0, 103] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050638__320.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11089 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_050638__719 | 0 | 0.0 | 0.300863 | 0 | [0, 35] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050638__719.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11090 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_050639__532 | 5 | 0.0 | 0.880743 | 5 | [0, 103] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050639__532.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11091 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240201_050640__807 | 0 | 0.0 | 0.302787 | 0 | [0, 35] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_050640__807.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11092 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050625__145 | 5 | 0.0 | 0.814569 | 5 | [0, 98] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050625__145.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11093 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050626__875 | 5 | 0.0 | 1.62692 | 5 | [0, 188] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050626__875.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11094 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050630__292 | 5 | 0.0 | 3.25278 | 5 | [0, 374] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050630__292.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11095 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050631__869 | 5 | 0.0 | 1.09522 | 5 | [0, 127] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050631__869.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11096 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_050632__497 | 5 | 0.0 | 1.28332 | 5 | [0, 149] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_050632__497.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11097 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050730__641 | 5 | 0.0 | 2.55414 | 5 | [0, 299] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050730__641.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11098 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050735__569 | 5 | 0.0 | 4.73182 | 5 | [0, 549] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050735__569.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11099 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050739__360 | 5 | 0.0 | 4.32781 | 5 | [0, 503] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050739__360.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11100 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050743__741 | 2 | 0.0 | 3.41249 | 1 | [0, 397] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050743__741.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11101 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_050749__298 | 1 | 0.0 | 6.00551 | 5 | [0, 691] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_050749__298.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11102 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050709__896 | 5 | 0.0 | 3.2339 | 5 | [0, 354] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050709__896.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11103 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050712__981 | 5 | 0.0 | 2.54346 | 5 | [0, 277] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050712__981.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11104 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_050715__184 | 0 | 0.0 | 0.292189 | 0 | [0, 32] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050715__184.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11105 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20240201_050715__698 | 0 | 0.0 | 0.291833 | 0 | [0, 32] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050715__698.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11106 | NVIDIA-RTX-4090-4x | timezone_bumper | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_050715__914 | 5 | 0.0 | 2.73034 | 5 | [0, 301] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_050715__914.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11107 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231225_092118__922 | 5 | 0.0 | 51.0444 | 5 | [70, 309] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_092118__922.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11108 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231225_092156__780 | 5 | 0.0 | 37.7153 | 5 | [70, 225] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_092156__780.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11109 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_091943__508 | 5 | 0.0 | 39.4811 | 5 | [73, 237] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_091943__508.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11110 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_092027__565 | 5 | 0.0 | 43.6732 | 5 | [73, 263] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_092027__565.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11111 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_090521__838 | 5 | 0.0 | 26.8841 | 5 | [73, 157] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_090521__838.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11112 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_091833__850 | 5 | 0.0 | 38.651 | 5 | [114, 226] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_091833__850.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11113 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_091903__120 | 5 | 0.0 | 30.0392 | 5 | [114, 172] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_091903__120.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11114 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_090454__289 | 5 | 0.0 | 31.2548 | 5 | [114, 179] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_090454__289.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11115 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_091618__597 | 5 | 0.0 | 65.7138 | 5 | [212, 210] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_091618__597.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11116 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_091754__227 | 5 | 0.0 | 95.9367 | 5 | [212, 557] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_091754__227.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11117 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_090422__838 | 5 | 0.0 | 88.6963 | 5 | [212, 367] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090422__838.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11118 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_092419__262 | 5 | 0.0 | 44.4627 | 5 | [402, 213] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_092419__262.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11119 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_092459__266 | 5 | 0.0 | 40.2267 | 5 | [402, 187] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_092459__266.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11120 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_090707__474 | 5 | 0.0 | 73.0678 | 5 | [402, 384] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_090707__474.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11121 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092248__481 | 5 | 0.0 | 51.953 | 5 | [400, 258] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092248__481.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11122 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092334__101 | 5 | 0.0 | 46.1205 | 5 | [400, 222] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092334__101.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11123 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_090554__211 | 5 | 0.0 | 33.2663 | 5 | [400, 144] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_090554__211.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11124 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_091356__770 | 0 | 0.0 | 9.76489 | 0 | [77, 375] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_091356__770.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11125 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_131925__426 | 0 | 0.0 | 10.0801 | 0 | [77, 388] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_131925__426.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11126 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_131934__590 | 0 | 0.0 | 8.74397 | 0 | [77, 337] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_131934__590.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11127 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_131942__173 | 0 | 0.0 | 7.39312 | 0 | [77, 285] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_131942__173.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11128 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_091346__723 | 1 | 0.0 | 7.17703 | 0 | [114, 271] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_091346__723.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11129 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_131858__380 | 0 | 0.0 | 2.94165 | 0 | [114, 105] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_131858__380.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11130 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_131907__489 | 1 | 0.0 | 9.20755 | 0 | [114, 349] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_131907__489.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11131 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_131915__847 | 0 | 0.0 | 8.0069 | 0 | [114, 303] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_131915__847.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11132 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_091339__426 | 0 | 0.0 | 8.71512 | 0 | [203, 185] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091339__426.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11133 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_131841__119 | 0 | 0.0 | 11.9022 | 0 | [203, 299] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_131841__119.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11134 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_131846__508 | 0 | 0.0 | 4.3235 | 0 | [203, 146] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_131846__508.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11135 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_131855__545 | 0 | 0.0 | 9.21093 | 0 | [203, 332] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_131855__545.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11136 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_091415__205 | 0 | 0.0 | 8.61934 | 0 | [366, 278] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091415__205.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11137 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_132030__951 | 0 | 0.0 | 11.2339 | 0 | [366, 373] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_132030__951.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11138 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_132048__147 | 0 | 0.0 | 17.271 | 0 | [366, 584] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_132048__147.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11139 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_132056__622 | 0 | 0.0 | 8.0301 | 0 | [366, 257] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_132056__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11140 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_091407__367 | 0 | 0.0 | 10.2397 | 0 | [363, 337] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_091407__367.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11141 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_131950__413 | 1 | 0.0 | 8.73254 | 0 | [363, 283] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_131950__413.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11142 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_132001__272 | 0 | 0.0 | 10.5528 | 0 | [363, 349] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_132001__272.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11143 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_132019__455 | 0 | 0.0 | 18.197 | 0 | [363, 616] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_132019__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11144 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_112338__241 | 1 | 0.0 | 2.94403 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112338__241.json | 55.0 | missing | missing | missing | |
| 11145 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | InJulia | 1SHOT | true | false | 5 | 20240217_112340__181 | 0 | 0.0 | 2.714 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112340__181.json | 25.0 | missing | missing | missing | |
| 11146 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_112343__220 | 1 | 0.0 | 2.54364 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112343__220.json | 55.0 | missing | missing | missing | |
| 11147 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_112348__573 | 1 | 0.0 | 4.41216 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112348__573.json | 55.0 | missing | missing | missing | |
| 11148 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 5 | 20240217_112350__763 | 1 | 0.0 | 2.73284 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_112350__763.json | 55.0 | missing | missing | missing | |
| 11149 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_112308__424 | 1 | 0.0 | 2.10296 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_112308__424.json | 55.0 | missing | missing | missing | |
| 11150 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_112313__637 | 1 | 0.0 | 5.07265 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_112313__637.json | 55.0 | missing | missing | missing | |
| 11151 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_112315__287 | 1 | 0.0 | 1.72973 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_112315__287.json | 55.0 | missing | missing | missing | |
| 11152 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_112319__732 | 1 | 0.0 | 3.45758 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_112319__732.json | 55.0 | missing | missing | missing | |
| 11153 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240217_112321__843 | 1 | 0.0 | 2.06658 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_112321__843.json | 55.0 | missing | missing | missing | |
| 11154 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240217_112235__796 | 1 | 0.0 | 2.87674 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112235__796.json | 55.0 | missing | missing | missing | |
| 11155 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_112238__928 | 0 | 0.0 | 2.69709 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112238__928.json | 25.0 | missing | missing | missing | |
| 11156 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_112240__744 | 0 | 0.0 | 2.17312 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112240__744.json | 25.0 | missing | missing | missing | |
| 11157 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_112248__774 | 0 | 0.0 | 7.16105 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112248__774.json | 25.0 | missing | missing | missing | |
| 11158 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240217_112250__942 | 0 | 0.0 | 2.21269 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112250__942.json | 25.0 | missing | missing | missing | |
| 11159 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240217_112505__839 | 0 | 0.0 | 3.43958 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112505__839.json | 25.0 | missing | missing | missing | |
| 11160 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_112513__379 | 1 | 0.0 | 7.63414 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112513__379.json | 55.0 | missing | missing | missing | |
| 11161 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240217_112516__725 | 0 | 0.0 | 3.00139 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112516__725.json | 0.0 | missing | missing | missing | |
| 11162 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_112521__599 | 1 | 0.0 | 4.83972 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112521__599.json | 55.0 | missing | missing | missing | |
| 11163 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240217_112525__110 | 1 | 0.0 | 3.63029 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112525__110.json | 55.0 | missing | missing | missing | |
| 11164 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_112408__447 | 1 | 0.0 | 2.48389 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112408__447.json | 55.0 | missing | missing | missing | |
| 11165 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_112413__186 | 0 | 0.0 | 4.39359 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112413__186.json | 25.0 | missing | missing | missing | |
| 11166 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_112417__946 | 1 | 0.0 | 4.26625 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112417__946.json | 55.0 | missing | missing | missing | |
| 11167 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20240217_112421__990 | 1 | 0.0 | 3.74703 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112421__990.json | 55.0 | missing | missing | missing | |
| 11168 | Apple-MacBook-Pro-M1 | timezone_bumper | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20240217_112424__397 | 0 | 0.0 | 3.06539 | 0 | [0, 0] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112424__397.json | 25.0 | missing | missing | missing | |
| 11169 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240224_003715__925 | 1 | 0.0 | 29.6171 | 0 | [0, 327] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_003715__925.json | 55.0 | missing | missing | missing | |
| 11170 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240224_003747__404 | 1 | 0.0 | 31.4649 | 0 | [0, 347] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_003747__404.json | 55.0 | missing | missing | missing | |
| 11171 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240224_003752__693 | 1 | 0.0 | 5.40457 | 0 | [0, 60] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_003752__693.json | 55.0 | missing | missing | missing | |
| 11172 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240224_003758__455 | 1 | 0.0 | 5.42113 | 0 | [0, 60] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_003758__455.json | 55.0 | missing | missing | missing | |
| 11173 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | InJulia | 1SHOT | true | true | 5 | 20240224_003834__134 | 1 | 0.0 | 36.8549 | 0 | [0, 405] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_003834__134.json | 55.0 | missing | missing | missing | |
| 11174 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240224_003433__336 | 0 | 0.0 | 3.70672 | 0 | [0, 55] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_003433__336.json | 0.0 | missing | missing | missing | |
| 11175 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240224_003437__534 | 0 | 0.0 | 3.89232 | 0 | [0, 61] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_003437__534.json | 0.0 | missing | missing | missing | |
| 11176 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240224_003440__442 | 0 | 0.0 | 3.88288 | 0 | [0, 61] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_003440__442.json | 0.0 | missing | missing | missing | |
| 11177 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240224_003444__507 | 0 | 0.0 | 3.66274 | 0 | [0, 58] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_003444__507.json | 0.0 | missing | missing | missing | |
| 11178 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20240224_003448__753 | 0 | 0.0 | 3.94481 | 0 | [0, 56] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_003448__753.json | 0.0 | missing | missing | missing | |
| 11179 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240224_003238__994 | 0 | 0.0 | 27.8006 | 0 | [0, 424] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_003238__994.json | 25.0 | missing | missing | missing | |
| 11180 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240224_003302__860 | 0 | 0.0 | 23.5636 | 0 | [0, 360] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_003302__860.json | 25.0 | missing | missing | missing | |
| 11181 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240224_003326__618 | 0 | 0.0 | 24.1226 | 0 | [0, 366] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_003326__618.json | 25.0 | missing | missing | missing | |
| 11182 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20240224_003349__377 | 0 | 0.0 | 23.196 | 0 | [0, 356] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_003349__377.json | 25.0 | missing | missing | missing | |
| 11183 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240224_003410__257 | 1 | 0.0 | 20.6762 | 0 | [0, 316] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_003410__257.json | 55.0 | missing | missing | missing | |
| 11184 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240224_004707__914 | 1 | 0.0 | 28.0103 | 0 | [0, 297] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_004707__914.json | 55.0 | missing | missing | missing | |
| 11185 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240224_004748__414 | 1 | 0.0 | 41.4948 | 0 | [0, 439] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_004748__414.json | 55.0 | missing | missing | missing | |
| 11186 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240224_004825__263 | 1 | 0.0 | 37.1198 | 0 | [0, 393] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_004825__263.json | 55.0 | missing | missing | missing | |
| 11187 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240224_004846__713 | 1 | 0.0 | 20.3551 | 0 | [0, 216] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_004846__713.json | 55.0 | missing | missing | missing | |
| 11188 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240224_004918__603 | 1 | 0.0 | 32.3181 | 0 | [0, 343] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_004918__603.json | 55.0 | missing | missing | missing | |
| 11189 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240224_004139__389 | 1 | 0.0 | 37.5505 | 0 | [0, 397] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_004139__389.json | 55.0 | missing | missing | missing | |
| 11190 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240224_004216__312 | 1 | 0.0 | 36.8712 | 0 | [0, 388] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_004216__312.json | 55.0 | missing | missing | missing | |
| 11191 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240224_004257__741 | 1 | 0.0 | 41.1355 | 0 | [0, 433] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_004257__741.json | 55.0 | missing | missing | missing | |
| 11192 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240224_004319__703 | 1 | 0.0 | 22.314 | 0 | [0, 236] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_004319__703.json | 55.0 | missing | missing | missing | |
| 11193 | Apple-MacBook-Pro-M1 | timezone_bumper | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20240224_004349__563 | 1 | 0.0 | 29.6545 | 0 | [0, 313] | 0.13.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_004349__563.json | 55.0 | missing | missing | missing | |
| 11194 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231213_205614__807 | 5 | 0.0006035 | 7.48748 | 5 | [64, 381] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_205614__807.json | 100.0 | missing | missing | missing | |
| 11195 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231225_230816__560 | 5 | 0.0002975 | 2.37332 | 5 | [64, 177] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_230816__560.json | 100.0 | missing | missing | missing | |
| 11196 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 5 | 20231225_230819__248 | 5 | 0.000383 | 3.18166 | 5 | [64, 234] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_230819__248.json | 100.0 | missing | missing | missing | |
| 11197 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo--optim | AsIs | 1SHOT | true | true | 5 | 20231215_201737__451 | 5 | 0.0 | 4.5297 | 5 | [64, 177] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_201737__451.json | 100.0 | 0.5 | missing | 0.5 | |
| 11198 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231213_205607__884 | 5 | 0.0004295 | 6.12347 | 5 | [67, 264] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_205607__884.json | 100.0 | missing | missing | missing | |
| 11199 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_230808__935 | 5 | 0.0006965 | 6.44397 | 5 | [67, 442] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_230808__935.json | 100.0 | missing | missing | missing | |
| 11200 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231225_230813__315 | 5 | 0.000572 | 5.55891 | 5 | [67, 359] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_230813__315.json | 100.0 | missing | missing | missing | |
| 11201 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_205657__306 | 5 | 0.0005315 | 4.97268 | 5 | [67, 332] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_205657__306.json | 100.0 | missing | missing | missing | |
| 11202 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 5 | 20231227_205701__436 | 5 | 0.0005165 | 4.53477 | 5 | [67, 322] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_205701__436.json | 100.0 | missing | missing | missing | |
| 11203 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 5 | 20231215_201732__871 | 5 | 0.0 | 7.15372 | 5 | [67, 344] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_201732__871.json | 100.0 | 0.5 | missing | 0.5 | |
| 11204 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_205601__295 | 0 | 0.000135 | 1.75958 | 0 | [102, 56] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_205601__295.json | 0.0 | missing | missing | missing | |
| 11205 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_230758__623 | 5 | 0.0003435 | 3.3706 | 5 | [102, 195] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_230758__623.json | 100.0 | missing | missing | missing | |
| 11206 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_230801__374 | 5 | 0.0002955 | 2.675 | 5 | [102, 163] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_230801__374.json | 100.0 | missing | missing | missing | |
| 11207 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_205650__195 | 5 | 0.000255 | 2.39685 | 5 | [102, 136] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_205650__195.json | 100.0 | missing | missing | missing | |
| 11208 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_205652__146 | 5 | 0.0001455 | 1.50978 | 5 | [102, 63] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_205652__146.json | 100.0 | missing | missing | missing | |
| 11209 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_201725__946 | 5 | 0.0 | 4.19875 | 5 | [102, 146] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_201725__946.json | 100.0 | 0.5 | missing | 0.5 | |
| 11210 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_205559__556 | 0 | 0.0002855 | 3.18449 | 0 | [181, 130] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_205559__556.json | 0.0 | missing | missing | missing | |
| 11211 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_230754__812 | 0 | 0.0002195 | 1.45885 | 0 | [181, 86] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230754__812.json | 0.0 | missing | missing | missing | |
| 11212 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_230755__565 | 0 | 0.0001745 | 1.27111 | 0 | [181, 56] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230755__565.json | 0.0 | missing | missing | missing | |
| 11213 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_205646__483 | 5 | 0.0001355 | 1.17074 | 5 | [181, 30] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_205646__483.json | 100.0 | missing | missing | missing | |
| 11214 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_205648__307 | 0 | 0.0001955 | 1.60891 | 0 | [181, 70] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_205648__307.json | 0.0 | missing | missing | missing | |
| 11215 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_201721__254 | 5 | 0.0 | 4.25654 | 5 | [181, 200] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_201721__254.json | 100.0 | 0.5 | missing | 0.5 | |
| 11216 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_205620__325 | 0 | 0.000317 | 3.30589 | 0 | [325, 103] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_205620__325.json | 0.0 | missing | missing | missing | |
| 11217 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_230825__177 | 5 | 0.0004925 | 3.20714 | 5 | [325, 220] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230825__177.json | 100.0 | missing | missing | missing | |
| 11218 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_230828__332 | 0 | 0.0002555 | 2.73541 | 0 | [325, 62] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230828__332.json | 0.0 | missing | missing | missing | |
| 11219 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_205710__381 | 5 | 0.000716 | 5.85177 | 5 | [325, 369] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_205710__381.json | 100.0 | missing | missing | missing | |
| 11220 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_205712__509 | 0 | 0.00026 | 1.5133 | 0 | [325, 65] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_205712__509.json | 0.0 | missing | missing | missing | |
| 11221 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_201744__520 | 5 | 0.0 | 5.4069 | 5 | [325, 244] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_201744__520.json | 100.0 | 0.5 | missing | 0.5 | |
| 11222 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_205617__158 | 0 | 0.00027 | 2.31198 | 0 | [324, 72] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_205617__158.json | 0.0 | missing | missing | missing | |
| 11223 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_230820__885 | 0 | 0.0002145 | 1.04022 | 0 | [324, 35] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_230820__885.json | 0.0 | missing | missing | missing | |
| 11224 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_230822__474 | 0 | 0.00027 | 1.40972 | 0 | [324, 72] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_230822__474.json | 0.0 | missing | missing | missing | |
| 11225 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_205703__702 | 0 | 0.000348 | 1.97856 | 0 | [324, 124] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_205703__702.json | 0.0 | missing | missing | missing | |
| 11226 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_205704__401 | 0 | 0.000246 | 1.13002 | 0 | [324, 56] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_205704__401.json | 0.0 | missing | missing | missing | |
| 11227 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | false | false | 5 | 20231215_201739__490 | 0 | 0.0 | 1.61395 | 0 | [324, 70] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_201739__490.json | 0.0 | 0.5 | missing | 0.5 | |
| 11228 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200713__496 | 5 | 0.00032 | 1.3945 | 5 | [67, 191] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200713__496.json | 100.0 | missing | missing | missing | |
| 11229 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200714__915 | 5 | 0.000215 | 1.06776 | 5 | [67, 121] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200714__915.json | 100.0 | missing | missing | missing | |
| 11230 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200715__389 | 1 | 0.000242 | 1.08147 | 0 | [67, 139] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200715__389.json | 55.0 | missing | missing | missing | |
| 11231 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200717__326 | 5 | 0.0002315 | 1.29596 | 5 | [67, 132] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200717__326.json | 100.0 | missing | missing | missing | |
| 11232 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 5 | 20240201_200718__765 | 5 | 0.000254 | 1.35735 | 5 | [67, 147] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200718__765.json | 100.0 | missing | missing | missing | |
| 11233 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200708__436 | 5 | 0.000156 | 0.802506 | 5 | [102, 70] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200708__436.json | 100.0 | missing | missing | missing | |
| 11234 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200709__595 | 5 | 0.0001575 | 0.975949 | 5 | [102, 71] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200709__595.json | 100.0 | missing | missing | missing | |
| 11235 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200710__581 | 5 | 0.0001485 | 0.857541 | 5 | [102, 65] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200710__581.json | 100.0 | missing | missing | missing | |
| 11236 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_200711__167 | 5 | 0.00015 | 0.880111 | 5 | [102, 66] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200711__167.json | 100.0 | missing | missing | missing | |
| 11237 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20240201_200712__797 | 0 | 0.000204 | 0.857071 | 0 | [102, 102] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200712__797.json | 25.0 | missing | missing | missing | |
| 11238 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200704__407 | 5 | 0.0001775 | 0.852703 | 5 | [181, 58] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200704__407.json | 100.0 | missing | missing | missing | |
| 11239 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200704__769 | 5 | 0.0002015 | 0.687919 | 5 | [181, 74] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200704__769.json | 100.0 | missing | missing | missing | |
| 11240 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200705__293 | 5 | 0.000203 | 0.967852 | 5 | [181, 75] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200705__293.json | 100.0 | missing | missing | missing | |
| 11241 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200706__112 | 5 | 0.0002015 | 0.684518 | 5 | [181, 74] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200706__112.json | 100.0 | missing | missing | missing | |
| 11242 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_200707__609 | 5 | 0.000209 | 0.94035 | 5 | [181, 79] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200707__609.json | 100.0 | missing | missing | missing | |
| 11243 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200724__578 | 0 | 0.0003215 | 1.14825 | 0 | [325, 106] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200724__578.json | 0.0 | missing | missing | missing | |
| 11244 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20240201_200727__160 | 0 | 0.0007055 | 2.46711 | 0 | [325, 362] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200727__160.json | 25.0 | missing | missing | missing | |
| 11245 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200728__923 | 0 | 0.0003245 | 1.01788 | 0 | [325, 108] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200728__923.json | 0.0 | missing | missing | missing | |
| 11246 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200730__496 | 0 | 0.0003725 | 2.18971 | 0 | [325, 140] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200730__496.json | 0.0 | missing | missing | missing | |
| 11247 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_200731__562 | 0 | 0.000299 | 1.02379 | 0 | [325, 91] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200731__562.json | 0.0 | missing | missing | missing | |
| 11248 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200720__125 | 5 | 0.0004935 | 1.90201 | 5 | [324, 221] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200720__125.json | 100.0 | missing | missing | missing | |
| 11249 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200721__142 | 2 | 0.0002535 | 0.840905 | 1 | [324, 61] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200721__142.json | 65.0 | missing | missing | missing | |
| 11250 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200722__460 | 5 | 0.000273 | 0.69471 | 5 | [324, 74] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200722__460.json | 100.0 | missing | missing | missing | |
| 11251 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200723__215 | 5 | 0.000264 | 0.665011 | 5 | [324, 68] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200723__215.json | 100.0 | missing | missing | missing | |
| 11252 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_200723__561 | 5 | 0.00021 | 0.451165 | 5 | [324, 32] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200723__561.json | 100.0 | missing | missing | missing | |
| 11253 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231213_205631__959 | 5 | 0.000588 | 5.44866 | 5 | [64, 262] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_205631__959.json | 100.0 | missing | missing | missing | |
| 11254 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231225_230837__259 | 5 | 0.0004 | 1.9095 | 5 | [64, 168] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_230837__259.json | 100.0 | missing | missing | missing | |
| 11255 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 5 | 20231225_230840__959 | 5 | 0.000426 | 2.13197 | 5 | [64, 181] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_230840__959.json | 100.0 | missing | missing | missing | |
| 11256 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | true | true | 5 | 20231215_201754__290 | 5 | 0.0 | 2.63893 | 5 | [64, 167] | 0.10.0-DEV | 5 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_201754__290.json | 100.0 | 0.9 | missing | 0.1 | |
| 11257 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231213_205626__593 | 5 | 0.000361 | 2.4218 | 5 | [67, 147] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_205626__593.json | 100.0 | missing | missing | missing | |
| 11258 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_230834__165 | 5 | 0.000375 | 1.78143 | 5 | [67, 154] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_230834__165.json | 100.0 | missing | missing | missing | |
| 11259 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231225_230835__729 | 5 | 0.000315 | 1.44557 | 5 | [67, 124] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_230835__729.json | 100.0 | missing | missing | missing | |
| 11260 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_205718__290 | 5 | 0.000325 | 2.35574 | 5 | [67, 129] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_205718__290.json | 100.0 | missing | missing | missing | |
| 11261 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 5 | 20231227_205720__126 | 5 | 0.000311 | 1.84463 | 5 | [67, 122] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_205720__126.json | 100.0 | missing | missing | missing | |
| 11262 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 5 | 20231215_201751__965 | 5 | 0.0 | 4.22067 | 5 | [67, 165] | 0.10.0-DEV | 5 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_201751__965.json | 100.0 | 0.9 | missing | 0.1 | |
| 11263 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_205623__469 | 5 | 0.00016 | 1.26784 | 5 | [102, 29] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_205623__469.json | 100.0 | missing | missing | missing | |
| 11264 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_230831__621 | 5 | 0.000222 | 1.04633 | 5 | [102, 60] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_230831__621.json | 100.0 | missing | missing | missing | |
| 11265 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_230832__386 | 5 | 0.000166 | 1.15946 | 5 | [102, 32] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_230832__386.json | 100.0 | missing | missing | missing | |
| 11266 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_205715__776 | 5 | 0.000224 | 1.13231 | 5 | [102, 61] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_205715__776.json | 100.0 | missing | missing | missing | |
| 11267 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_205716__174 | 5 | 0.00016 | 0.779533 | 5 | [102, 29] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_205716__174.json | 100.0 | missing | missing | missing | |
| 11268 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_201747__725 | 5 | 0.0 | 0.967359 | 5 | [102, 29] | 0.10.0-DEV | 5 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_201747__725.json | 100.0 | 0.9 | missing | 0.1 | |
| 11269 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_205622__372 | 5 | 0.000239 | 1.63346 | 5 | [181, 29] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_205622__372.json | 100.0 | missing | missing | missing | |
| 11270 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_230829__302 | 5 | 0.000353 | 1.08916 | 5 | [181, 86] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230829__302.json | 100.0 | missing | missing | missing | |
| 11271 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_230830__109 | 5 | 0.000239 | 0.706518 | 5 | [181, 29] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230830__109.json | 100.0 | missing | missing | missing | |
| 11272 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_205713__398 | 5 | 0.000239 | 0.737521 | 5 | [181, 29] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_205713__398.json | 100.0 | missing | missing | missing | |
| 11273 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_205714__182 | 5 | 0.000259 | 0.984686 | 5 | [181, 39] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_205714__182.json | 100.0 | missing | missing | missing | |
| 11274 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_201746__799 | 5 | 0.0 | 1.8109 | 5 | [181, 29] | 0.10.0-DEV | 5 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_201746__799.json | 100.0 | 0.9 | missing | 0.1 | |
| 11275 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_205638__124 | 5 | 0.000719 | 3.98159 | 5 | [325, 197] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_205638__124.json | 100.0 | missing | missing | missing | |
| 11276 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_230843__271 | 0 | 0.000513 | 1.51943 | 0 | [325, 94] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230843__271.json | 0.0 | missing | missing | missing | |
| 11277 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_230844__978 | 0 | 0.000557 | 1.33201 | 0 | [325, 116] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_230844__978.json | 0.0 | missing | missing | missing | |
| 11278 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_205953__253 | 0 | 0.000511 | 1.54252 | 0 | [325, 93] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_205953__253.json | 0.0 | missing | missing | missing | |
| 11279 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_205954__928 | 0 | 0.000553 | 1.56771 | 0 | [325, 114] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_205954__928.json | 0.0 | missing | missing | missing | |
| 11280 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231215_201758__422 | 0 | 0.0 | 2.6787 | 0 | [325, 98] | 0.10.0-DEV | 5 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_201758__422.json | 0.0 | 0.9 | missing | 0.1 | |
| 11281 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_205634__129 | 0 | 0.000564 | 3.11527 | 0 | [324, 120] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_205634__129.json | 0.0 | missing | missing | missing | |
| 11282 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_230841__261 | 5 | 0.00045 | 1.0254 | 5 | [324, 63] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_230841__261.json | 100.0 | missing | missing | missing | |
| 11283 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_230842__921 | 5 | 0.000398 | 0.770204 | 5 | [324, 37] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_230842__921.json | 100.0 | missing | missing | missing | |
| 11284 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_205721__994 | 5 | 0.0004 | 0.891873 | 5 | [324, 38] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_205721__994.json | 100.0 | missing | missing | missing | |
| 11285 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_201756__901 | 5 | 0.0 | 1.56627 | 5 | [324, 37] | 0.10.0-DEV | 5 | 1.0 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_201756__901.json | 100.0 | 0.9 | missing | 0.1 | |
| 11286 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_114207__802 | 5 | 0.01699 | 37.7352 | 5 | [67, 544] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_114207__802.json | 100.0 | missing | missing | missing | |
| 11287 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_114236__454 | 5 | 0.01453 | 28.1811 | 5 | [67, 462] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_114236__454.json | 100.0 | missing | missing | missing | |
| 11288 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_114314__604 | 5 | 0.01477 | 38.7421 | 5 | [67, 470] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_114314__604.json | 100.0 | missing | missing | missing | |
| 11289 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | InJulia | 1SHOT | true | false | 5 | 20240201_114405__622 | 0 | 0.01546 | 50.1211 | 0 | [67, 493] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_114405__622.json | 25.0 | missing | missing | missing | |
| 11290 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 5 | 20240201_114443__139 | 5 | 0.01528 | 38.2851 | 5 | [67, 487] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_114443__139.json | 100.0 | missing | missing | missing | |
| 11291 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_113807__546 | 5 | 0.00294 | 2.90056 | 5 | [102, 64] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_113807__546.json | 100.0 | missing | missing | missing | |
| 11292 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_113816__927 | 5 | 0.00537 | 8.86228 | 5 | [102, 145] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_113816__927.json | 100.0 | missing | missing | missing | |
| 11293 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_113822__918 | 5 | 0.00321 | 5.87966 | 5 | [102, 73] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_113822__918.json | 100.0 | missing | missing | missing | |
| 11294 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_113827__779 | 5 | 0.00318 | 4.55099 | 5 | [102, 72] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_113827__779.json | 100.0 | missing | missing | missing | |
| 11295 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20240201_113835__875 | 5 | 0.00354 | 7.89028 | 5 | [102, 84] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_113835__875.json | 100.0 | missing | missing | missing | |
| 11296 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_113641__898 | 5 | 0.0112 | 14.5249 | 5 | [181, 313] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_113641__898.json | 100.0 | missing | missing | missing | |
| 11297 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_113647__219 | 5 | 0.00367 | 5.89014 | 5 | [181, 62] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_113647__219.json | 100.0 | missing | missing | missing | |
| 11298 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_113656__734 | 5 | 0.00679 | 8.97409 | 5 | [181, 166] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_113656__734.json | 100.0 | missing | missing | missing | |
| 11299 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_113717__590 | 5 | 0.01156 | 20.2157 | 5 | [181, 325] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_113717__590.json | 100.0 | missing | missing | missing | |
| 11300 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20240201_113738__723 | 5 | 0.00913 | 21.4182 | 5 | [181, 244] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_113738__723.json | 100.0 | missing | missing | missing | |
| 11301 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_115425__999 | 5 | 0.01693 | 49.5213 | 5 | [325, 456] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_115425__999.json | 100.0 | missing | missing | missing | |
| 11302 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20240201_115459__640 | 0 | 0.01831 | 34.3684 | 0 | [325, 502] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_115459__640.json | 0.0 | missing | missing | missing | |
| 11303 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_115514__608 | 5 | 0.01039 | 14.6049 | 5 | [325, 238] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_115514__608.json | 100.0 | missing | missing | missing | |
| 11304 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_115604__279 | 5 | 0.0214 | 49.7423 | 5 | [325, 605] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_115604__279.json | 100.0 | missing | missing | missing | |
| 11305 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20240201_115630__375 | 5 | 0.01597 | 25.8203 | 5 | [325, 424] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_115630__375.json | 100.0 | missing | missing | missing | |
| 11306 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_114807__371 | 5 | 0.02097 | 51.0936 | 5 | [324, 591] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_114807__371.json | 100.0 | missing | missing | missing | |
| 11307 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_114900__911 | 5 | 0.01767 | 53.3794 | 5 | [324, 481] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_114900__911.json | 100.0 | missing | missing | missing | |
| 11308 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_114938__850 | 5 | 0.02133 | 38.3067 | 5 | [324, 603] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_114938__850.json | 100.0 | missing | missing | missing | |
| 11309 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_115004__180 | 5 | 0.01347 | 24.994 | 5 | [324, 341] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_115004__180.json | 100.0 | missing | missing | missing | |
| 11310 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20240201_115044__368 | 5 | 0.02175 | 40.2404 | 5 | [324, 617] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_115044__368.json | 100.0 | missing | missing | missing | |
| 11311 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231213_205909__233 | 5 | 0.0136 | 49.416 | 5 | [64, 432] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_205909__233.json | 100.0 | missing | missing | missing | |
| 11312 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231225_231000__358 | 5 | 0.01219 | 16.1331 | 5 | [64, 385] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_231000__358.json | 100.0 | missing | missing | missing | |
| 11313 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | AsIs | 1SHOT | true | true | 5 | 20231225_231015__806 | 5 | 0.01399 | 14.5411 | 5 | [64, 445] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_231015__806.json | 100.0 | missing | missing | missing | |
| 11314 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview--optim | AsIs | 1SHOT | true | false | 5 | 20231215_201944__962 | 0 | 0.0 | 27.3772 | 0 | [64, 325] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_201944__962.json | 25.0 | 0.1 | missing | 0.9 | |
| 11315 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | InJulia | 1SHOT | false | false | 5 | 20231213_205819__798 | 0 | 0.01957 | 54.6329 | 0 | [67, 630] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_205819__798.json | 0.0 | missing | missing | missing | |
| 11316 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_230927__976 | 5 | 0.01252 | 12.5294 | 5 | [67, 395] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_230927__976.json | 100.0 | missing | missing | missing | |
| 11317 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231225_230944__977 | 5 | 0.01315 | 17.0163 | 5 | [67, 416] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_230944__977.json | 100.0 | missing | missing | missing | |
| 11318 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 5 | 20231227_210130__664 | 5 | 0.01243 | 41.0691 | 5 | [67, 392] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_210130__664.json | 100.0 | missing | missing | missing | |
| 11319 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | InJulia | 1SHOT | true | false | 5 | 20231227_210154__183 | 0 | 0.01336 | 23.9983 | 0 | [67, 423] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_210154__183.json | 25.0 | missing | missing | missing | |
| 11320 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 5 | 20231215_201916__961 | 5 | 0.0 | 31.9782 | 5 | [67, 435] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_201916__961.json | 100.0 | 0.1 | missing | 0.9 | |
| 11321 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_205724__713 | 5 | 0.00564 | 24.5274 | 5 | [102, 154] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_205724__713.json | 100.0 | missing | missing | missing | |
| 11322 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_230910__882 | 5 | 0.00438 | 4.11661 | 5 | [102, 112] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_230910__882.json | 100.0 | missing | missing | missing | |
| 11323 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_230914__642 | 5 | 0.00528 | 4.24127 | 5 | [102, 142] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_230914__642.json | 100.0 | missing | missing | missing | |
| 11324 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_210037__510 | 5 | 0.00543 | 11.9831 | 5 | [102, 147] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_210037__510.json | 100.0 | missing | missing | missing | |
| 11325 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_210049__818 | 5 | 0.00534 | 11.0416 | 5 | [102, 144] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_210049__818.json | 100.0 | missing | missing | missing | |
| 11326 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_201844__471 | 5 | 0.0 | 12.502 | 5 | [102, 134] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_201844__471.json | 100.0 | 0.1 | missing | 0.9 | |
| 11327 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_205700__502 | 5 | 0.00871 | 21.3544 | 5 | [181, 230] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_205700__502.json | 100.0 | missing | missing | missing | |
| 11328 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_230854__848 | 5 | 0.00922 | 9.46398 | 5 | [181, 247] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230854__848.json | 100.0 | missing | missing | missing | |
| 11329 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_230905__849 | 5 | 0.01027 | 11.3871 | 5 | [181, 282] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_230905__849.json | 100.0 | missing | missing | missing | |
| 11330 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_210009__698 | 5 | 0.01063 | 15.2892 | 5 | [181, 294] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_210009__698.json | 100.0 | missing | missing | missing | |
| 11331 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_210025__553 | 5 | 0.01006 | 15.7866 | 5 | [181, 275] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_210025__553.json | 100.0 | missing | missing | missing | |
| 11332 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_201832__720 | 5 | 0.0 | 33.4024 | 5 | [181, 246] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_201832__720.json | 100.0 | 0.1 | missing | 0.9 | |
| 11333 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_210005__347 | 5 | 0.01276 | 21.7908 | 5 | [325, 317] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_210005__347.json | 100.0 | missing | missing | missing | |
| 11334 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_231110__367 | 5 | 0.01927 | 19.2829 | 5 | [325, 534] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231110__367.json | 100.0 | missing | missing | missing | |
| 11335 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_231135__150 | 5 | 0.01966 | 23.9667 | 5 | [325, 547] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231135__150.json | 100.0 | missing | missing | missing | |
| 11336 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_210340__721 | 5 | 0.0166 | 26.3096 | 5 | [325, 445] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_210340__721.json | 100.0 | missing | missing | missing | |
| 11337 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_210404__259 | 5 | 0.01294 | 24.0665 | 5 | [325, 323] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_210404__259.json | 100.0 | missing | missing | missing | |
| 11338 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_202112__317 | 5 | 0.0 | 39.0214 | 5 | [325, 451] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_202112__317.json | 100.0 | 0.1 | missing | 0.9 | |
| 11339 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_205943__113 | 5 | 0.01428 | 33.9524 | 5 | [324, 368] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_205943__113.json | 100.0 | missing | missing | missing | |
| 11340 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_231035__208 | 5 | 0.01533 | 20.132 | 5 | [324, 403] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_231035__208.json | 100.0 | missing | missing | missing | |
| 11341 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_231051__605 | 5 | 0.01701 | 16.0092 | 5 | [324, 459] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_231051__605.json | 100.0 | missing | missing | missing | |
| 11342 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_210235__428 | 5 | 0.01809 | 40.972 | 5 | [324, 495] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_210235__428.json | 100.0 | missing | missing | missing | |
| 11343 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_210313__500 | 0 | 0.0165 | 38.2931 | 0 | [324, 442] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_210313__500.json | 25.0 | missing | missing | missing | |
| 11344 | Apple-MacBook-Pro-M1 | timezone_bumper | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 5 | 20231215_202033__545 | 5 | 0.0 | 48.703 | 5 | [324, 479] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_202033__545.json | 100.0 | 0.1 | missing | 0.9 | |
| 11345 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | AsIs | 1SHOT | false | false | 5 | 20231214_084337__616 | 0 | 0.0 | 10.463 | 0 | [57, 317] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__AsIs__1SHOT__20231214_084337__616.json | 0.0 | missing | missing | missing | |
| 11346 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_084054__979 | 0 | 0.0 | 11.6292 | 0 | [57, 350] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__AsIs__1SHOT__20231225_084054__979.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11347 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | AsIs | 1SHOT | false | false | 5 | 20231225_084105__732 | 0 | 0.0 | 10.4463 | 0 | [1, 327] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__AsIs__1SHOT__20231225_084105__732.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11348 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | InJulia | 1SHOT | true | true | 5 | 20231214_084327__135 | 1 | 0.0 | 12.0294 | 0 | [74, 359] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__InJulia__1SHOT__20231214_084327__135.json | 55.0 | missing | missing | missing | |
| 11349 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_084032__375 | 0 | 0.0 | 13.8508 | 0 | [74, 410] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__InJulia__1SHOT__20231225_084032__375.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11350 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_084042__784 | 1 | 0.0 | 10.549 | 0 | [1, 330] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__InJulia__1SHOT__20231225_084042__784.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11351 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | InJulia | 1SHOT | true | true | 5 | 20231227_084652__623 | 1 | 0.0 | 15.3382 | 0 | [74, 461] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__InJulia__1SHOT__20231227_084652__623.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11352 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084314__641 | 1 | 0.0 | 9.36455 | 0 | [103, 268] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_084314__641.json | 55.0 | missing | missing | missing | |
| 11353 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084009__406 | 1 | 0.0 | 7.60119 | 0 | [103, 215] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_084009__406.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11354 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084018__266 | 1 | 0.0 | 7.54759 | 0 | [1, 235] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_084018__266.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11355 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084636__917 | 1 | 0.0 | 8.14064 | 0 | [103, 236] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_084636__917.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11356 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084305__284 | 1 | 0.0 | 22.7925 | 0 | [201, 607] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084305__284.json | 55.0 | missing | missing | missing | |
| 11357 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_083948__102 | 1 | 0.0 | 30.4059 | 0 | [219, 667] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_083948__102.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11358 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084001__821 | 1 | 0.0 | 12.9921 | 0 | [1, 381] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084001__821.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11359 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_084628__220 | 1 | 0.0 | 25.3701 | 0 | [219, 554] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084628__220.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11360 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_084419__377 | 0 | 0.0 | 24.3539 | 0 | [11, 650] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084419__377.json | 25.0 | missing | missing | missing | |
| 11361 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084207__439 | 1 | 0.0 | 23.377 | 0 | [11, 628] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084207__439.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11362 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084229__388 | 1 | 0.0 | 21.6094 | 0 | [1, 588] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084229__388.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11363 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_084733__353 | 1 | 0.0 | 21.6528 | 0 | [11, 590] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084733__353.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11364 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_084355__609 | 0 | 0.0 | 17.6727 | 0 | [374, 400] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_084355__609.json | 0.0 | missing | missing | missing | |
| 11365 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084129__253 | 1 | 0.0 | 24.017 | 0 | [374, 564] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_084129__253.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11366 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084144__999 | 1 | 0.0 | 15.0735 | 0 | [1, 420] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_084144__999.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11367 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_084711__425 | 1 | 0.0 | 19.3638 | 0 | [374, 451] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_084711__425.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11368 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | AsIs | 1SHOT | false | false | 5 | 20231214_085327__828 | 0 | 0.0 | 15.097 | 0 | [57, 452] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__AsIs__1SHOT__20231214_085327__828.json | 0.0 | missing | missing | missing | |
| 11369 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | AsIs | 1SHOT | true | true | 5 | 20231225_090147__945 | 2 | 0.0 | 5.15935 | 1 | [71, 165] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__AsIs__1SHOT__20231225_090147__945.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11370 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | AsIs | 1SHOT | true | true | 5 | 20231225_090153__333 | 5 | 0.0 | 5.13528 | 5 | [71, 165] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__AsIs__1SHOT__20231225_090153__333.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11371 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | InJulia | 1SHOT | true | true | 5 | 20231214_085312__595 | 1 | 0.0 | 10.4409 | 0 | [74, 312] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__InJulia__1SHOT__20231214_085312__595.json | 55.0 | missing | missing | missing | |
| 11372 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_090137__868 | 5 | 0.0 | 8.57739 | 5 | [74, 283] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__InJulia__1SHOT__20231225_090137__868.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11373 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_090142__349 | 2 | 0.0 | 5.03804 | 1 | [74, 161] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__InJulia__1SHOT__20231225_090142__349.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11374 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | InJulia | 1SHOT | true | false | 5 | 20231227_085758__238 | 0 | 0.0 | 6.25118 | 0 | [74, 201] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__InJulia__1SHOT__20231227_085758__238.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11375 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_085301__310 | 0 | 0.0 | 11.6111 | 0 | [103, 334] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_085301__310.json | 0.0 | missing | missing | missing | |
| 11376 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090121__915 | 2 | 0.0 | 8.94039 | 1 | [113, 289] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_090121__915.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11377 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090128__571 | 5 | 0.0 | 7.71555 | 5 | [113, 247] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_090128__571.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11378 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085751__326 | 5 | 0.0 | 6.65742 | 5 | [113, 209] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_085751__326.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11379 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_085250__755 | 0 | 0.0 | 17.5646 | 0 | [201, 467] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085250__755.json | 0.0 | missing | missing | missing | |
| 11380 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090104__646 | 5 | 0.0 | 16.2405 | 5 | [211, 314] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090104__646.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11381 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090112__359 | 5 | 0.0 | 7.44667 | 5 | [211, 221] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090112__359.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11382 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_085745__416 | 5 | 0.0 | 14.1651 | 5 | [211, 250] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085745__416.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11383 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_085406__806 | 0 | 0.0 | 21.0846 | 0 | [11, 562] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085406__806.json | 0.0 | missing | missing | missing | |
| 11384 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090229__805 | 2 | 0.0 | 12.479 | 1 | [377, 355] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090229__805.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11385 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090237__643 | 5 | 0.0 | 8.09549 | 5 | [377, 214] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090237__643.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11386 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_085819__863 | 2 | 0.0 | 10.1473 | 1 | [377, 278] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085819__863.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11387 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085345__247 | 1 | 0.0 | 18.3237 | 0 | [374, 417] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_085345__247.json | 55.0 | missing | missing | missing | |
| 11388 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_090203__532 | 2 | 0.0 | 9.94566 | 1 | [374, 274] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_090203__532.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11389 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_090216__186 | 0 | 0.0 | 13.1322 | 0 | [374, 376] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_090216__186.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11390 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_085808__144 | 2 | 0.0 | 10.7377 | 1 | [374, 297] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_085808__144.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11391 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_183057__907 | 2 | 0.0 | 14.9717 | 1 | [74, 289] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183057__907.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11392 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_183115__631 | 2 | 0.0 | 17.7507 | 1 | [74, 344] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183115__631.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11393 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_183122__490 | 2 | 0.0 | 7.11959 | 1 | [74, 133] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183122__490.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11394 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_183009__206 | 2 | 0.0 | 10.5485 | 1 | [113, 185] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183009__206.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11395 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_183025__114 | 2 | 0.0 | 15.2528 | 1 | [113, 290] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183025__114.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11396 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_183042__884 | 0 | 0.0 | 16.988 | 0 | [113, 324] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183042__884.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11397 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_182930__103 | 5 | 0.0 | 16.6186 | 5 | [211, 304] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182930__103.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11398 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_182946__126 | 2 | 0.0 | 15.6496 | 1 | [211, 284] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182946__126.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11399 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_182959__157 | 2 | 0.0 | 13.1617 | 1 | [211, 230] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182959__157.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11400 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_183220__585 | 2 | 0.0 | 10.0225 | 1 | [377, 154] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183220__585.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11401 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_183238__407 | 2 | 0.0 | 17.5141 | 1 | [377, 301] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183238__407.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11402 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_183256__929 | 0 | 0.0 | 18.3406 | 0 | [377, 315] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183256__929.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11403 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_183138__754 | 0 | 0.0 | 16.6935 | 0 | [374, 285] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183138__754.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11404 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_183155__530 | 5 | 0.0 | 16.4999 | 5 | [374, 280] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183155__530.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11405 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_183210__513 | 2 | 0.0 | 14.6743 | 1 | [374, 246] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183210__513.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11406 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231213_210201__152 | 5 | 0.00140789 | 3.45364 | 5 | [69, 151] | 0.10.0-DEV | 5 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__AsIs__1SHOT__20231213_210201__152.json | 100.0 | missing | missing | missing | |
| 11407 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231225_231410__310 | 5 | 0.00169913 | 6.61237 | 5 | [69, 187] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__AsIs__1SHOT__20231225_231410__310.json | 100.0 | missing | missing | missing | |
| 11408 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | AsIs | 1SHOT | true | true | 5 | 20231225_231427__469 | 5 | 0.00323623 | 16.187 | 5 | [69, 377] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__AsIs__1SHOT__20231225_231427__469.json | 100.0 | missing | missing | missing | |
| 11409 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium--optim | AsIs | 1SHOT | true | true | 5 | 20231215_202323__141 | 5 | 0.0 | 22.8691 | 5 | [69, 327] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__AsIs__1SHOT__20231215_202323__141.json | 100.0 | 0.9 | missing | 0.3 | |
| 11410 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231213_210157__366 | 4 | 0.00343849 | 8.85062 | 4 | [72, 401] | 0.10.0-DEV | 5 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__InJulia__1SHOT__20231213_210157__366.json | 90.0 | missing | missing | missing | |
| 11411 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_231353__563 | 5 | 0.00295309 | 7.63833 | 5 | [72, 341] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__InJulia__1SHOT__20231225_231353__563.json | 100.0 | missing | missing | missing | |
| 11412 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231225_231404__298 | 5 | 0.00340613 | 10.436 | 5 | [72, 397] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__InJulia__1SHOT__20231225_231404__298.json | 100.0 | missing | missing | missing | |
| 11413 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_210621__737 | 5 | 0.00305826 | 7.90826 | 5 | [72, 354] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__InJulia__1SHOT__20231227_210621__737.json | 100.0 | missing | missing | missing | |
| 11414 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | InJulia | 1SHOT | true | true | 5 | 20231227_210626__603 | 5 | 0.0020551 | 5.15969 | 5 | [72, 230] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__InJulia__1SHOT__20231227_210626__603.json | 100.0 | missing | missing | missing | |
| 11415 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium--optim | InJulia | 1SHOT | true | true | 5 | 20231215_202300__378 | 5 | 0.0 | 14.0324 | 5 | [72, 163] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__InJulia__1SHOT__20231215_202300__378.json | 100.0 | 0.9 | missing | 0.3 | |
| 11416 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_210148__740 | 1 | 0.00187725 | 6.48533 | 0 | [111, 195] | 0.10.0-DEV | 5 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_210148__740.json | 55.0 | missing | missing | missing | |
| 11417 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_231342__641 | 5 | 0.00180444 | 4.31748 | 5 | [111, 186] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_231342__641.json | 100.0 | missing | missing | missing | |
| 11418 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_231346__401 | 5 | 0.00173163 | 3.97186 | 5 | [111, 177] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_231346__401.json | 100.0 | missing | missing | missing | |
| 11419 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_210605__239 | 5 | 0.00199051 | 13.9255 | 5 | [111, 209] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_210605__239.json | 100.0 | missing | missing | missing | |
| 11420 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_210613__700 | 5 | 0.00224939 | 7.67259 | 5 | [111, 241] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_210613__700.json | 100.0 | missing | missing | missing | |
| 11421 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231215_202246__854 | 5 | 0.0 | 13.5079 | 5 | [111, 151] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_202246__854.json | 100.0 | 0.9 | missing | 0.3 | |
| 11422 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_210142__719 | 5 | 0.00460121 | 40.152 | 5 | [209, 499] | 0.10.0-DEV | 5 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_210142__719.json | 100.0 | missing | missing | missing | |
| 11423 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_231326__611 | 5 | 0.00413199 | 9.95493 | 5 | [209, 441] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231326__611.json | 100.0 | missing | missing | missing | |
| 11424 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_231337__804 | 5 | 0.00437469 | 10.6045 | 5 | [209, 471] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231337__804.json | 100.0 | missing | missing | missing | |
| 11425 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_210536__723 | 5 | 0.00481964 | 13.7868 | 5 | [209, 526] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_210536__723.json | 100.0 | missing | missing | missing | |
| 11426 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_210551__878 | 5 | 0.00425334 | 14.9975 | 5 | [209, 456] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_210551__878.json | 100.0 | missing | missing | missing | |
| 11427 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231215_202232__403 | 5 | 0.0 | 33.7569 | 5 | [209, 435] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_202232__403.json | 100.0 | 0.9 | missing | 0.3 | |
| 11428 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_210219__628 | 5 | 0.00422962 | 8.99407 | 5 | [374, 398] | 0.10.0-DEV | 5 | 1.0 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_210219__628.json | 100.0 | missing | missing | missing | |
| 11429 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_231454__137 | 5 | 0.00439142 | 9.55327 | 5 | [374, 418] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231454__137.json | 100.0 | missing | missing | missing | |
| 11430 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_231508__585 | 1 | 0.00591234 | 13.849 | 5 | [374, 606] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231508__585.json | 80.0 | missing | missing | missing | |
| 11431 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_210735__591 | 5 | 0.00635729 | 43.4189 | 5 | [374, 661] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_210735__591.json | 100.0 | missing | missing | missing | |
| 11432 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_210803__262 | 5 | 0.00528941 | 27.4566 | 5 | [374, 529] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_210803__262.json | 100.0 | missing | missing | missing | |
| 11433 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231215_202352__150 | 5 | 0.0 | 18.1137 | 5 | [374, 432] | 0.10.0-DEV | 5 | 1.0 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_202352__150.json | 100.0 | 0.9 | missing | 0.3 | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |

(Rows 11,434–11,510 truncated for display: further `timezone_bumper` evaluations for the `mistral-medium`, `mistral-small`, `mistral-tiny`, and `mistral:7b-instruct-q4_K_M` models across the `AsIs`, `InJulia`, `JuliaExpertAsk`, `JuliaExpertCoTTask`, `JuliaRecapTask`, and `JuliaRecapCoTTask` prompt templates.)
| 11511 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_092849__864 | 1 | 0.0 | 7.48931 | 0 | [70, 185] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_092849__864.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11512 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_090850__294 | 1 | 0.0 | 11.4487 | 0 | [70, 286] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_090850__294.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11513 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_092829__485 | 1 | 0.0 | 3.05739 | 0 | [111, 63] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_092829__485.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11514 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_092836__366 | 1 | 0.0 | 7.6875 | 0 | [111, 185] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_092836__366.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11515 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_090839__595 | 0 | 0.0 | 3.33001 | 0 | [111, 70] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_090839__595.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11516 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_092809__493 | 1 | 0.0 | 22.5607 | 0 | [209, 408] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_092809__493.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11517 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_092826__333 | 1 | 0.0 | 15.9242 | 0 | [209, 381] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_092826__333.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11518 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_090835__141 | 1 | 0.0 | 12.9802 | 0 | [209, 168] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090835__141.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11519 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093015__369 | 1 | 0.0 | 17.6342 | 0 | [378, 395] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093015__369.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11520 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093023__245 | 1 | 0.0 | 7.71722 | 0 | [378, 147] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093023__245.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11521 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_090927__316 | 1 | 0.0 | 26.437 | 0 | [378, 605] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_090927__316.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11522 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092932__473 | 1 | 0.0 | 16.3475 | 0 | [376, 364] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092932__473.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11523 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092957__406 | 1 | 0.0 | 25.0085 | 0 | [376, 576] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092957__406.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11524 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_090900__706 | 0 | 0.0 | 9.97851 | 0 | [376, 203] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_090900__706.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11525 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004250__501 | 1 | 0.0 | 10.8377 | 0 | [69, 343] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004250__501.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11526 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004256__385 | 1 | 0.0 | 6.24295 | 0 | [69, 193] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004256__385.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11527 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004304__124 | 5 | 0.0 | 7.63896 | 5 | [69, 238] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004304__124.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11528 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004311__900 | 5 | 0.0 | 6.99294 | 5 | [69, 218] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004311__900.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11529 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004324__459 | 1 | 0.0 | 12.9356 | 0 | [69, 410] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004324__459.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11530 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004228__545 | 1 | 0.0 | 2.96869 | 0 | [110, 78] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004228__545.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11531 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004230__386 | 1 | 0.0 | 2.64896 | 0 | [110, 69] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004230__386.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11532 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004233__884 | 1 | 0.0 | 2.75102 | 0 | [110, 72] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004233__884.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11533 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004236__671 | 5 | 0.0 | 2.80437 | 5 | [110, 74] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004236__671.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11534 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004239__628 | 5 | 0.0 | 3.09971 | 5 | [110, 84] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004239__628.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11535 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_004143__499 | 0 | 0.0 | 22.8254 | 0 | [208, 663] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004143__499.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11536 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004152__522 | 1 | 0.0 | 9.28233 | 0 | [208, 269] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004152__522.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11537 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004201__702 | 5 | 0.0 | 8.65301 | 5 | [208, 249] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004201__702.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11538 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004212__292 | 1 | 0.0 | 11.664 | 0 | [208, 345] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004212__292.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11539 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004224__345 | 1 | 0.0 | 11.9189 | 0 | [208, 353] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004224__345.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11540 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004440__160 | 5 | 0.0 | 8.98066 | 5 | [377, 231] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004440__160.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11541 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004451__781 | 1 | 0.0 | 11.2965 | 0 | [377, 303] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004451__781.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11542 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_004503__294 | 0 | 0.0 | 11.4483 | 0 | [377, 308] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004503__294.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11543 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004519__135 | 1 | 0.0 | 16.3821 | 0 | [377, 459] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004519__135.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11544 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004541__562 | 1 | 0.0 | 21.938 | 5 | [377, 625] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004541__562.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11545 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004339__494 | 1 | 0.0 | 15.0596 | 0 | [375, 419] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004339__494.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11546 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004351__289 | 1 | 0.0 | 11.3438 | 0 | [375, 305] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004351__289.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11547 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004408__815 | 1 | 0.0 | 16.5448 | 0 | [375, 464] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004408__815.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11548 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004421__661 | 1 | 0.0 | 13.0064 | 0 | [375, 357] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004421__661.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11549 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004431__385 | 1 | 0.0 | 10.0879 | 0 | [375, 266] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004431__385.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11550 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004736__317 | 1 | 0.0 | 8.15433 | 0 | [69, 200] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004736__317.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11551 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004747__137 | 1 | 0.0 | 10.2072 | 0 | [69, 253] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004747__137.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11552 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004756__352 | 1 | 0.0 | 9.43706 | 0 | [69, 233] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004756__352.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11553 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004805__996 | 1 | 0.0 | 8.28405 | 0 | [69, 203] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004805__996.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11554 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004812__366 | 1 | 0.0 | 7.45734 | 0 | [69, 182] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004812__366.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11555 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004710__187 | 1 | 0.0 | 6.89194 | 0 | [110, 162] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004710__187.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11556 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004713__970 | 1 | 0.0 | 3.38911 | 0 | [110, 71] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004713__970.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11557 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004717__876 | 1 | 0.0 | 3.77885 | 0 | [110, 81] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004717__876.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11558 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004724__385 | 1 | 0.0 | 6.84671 | 0 | [110, 161] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004724__385.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11559 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004728__853 | 1 | 0.0 | 3.89385 | 0 | [110, 84] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004728__853.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11560 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004548__662 | 1 | 0.0 | 7.20853 | 0 | [208, 135] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004548__662.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11561 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004606__966 | 1 | 0.0 | 17.5558 | 0 | [208, 417] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004606__966.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11562 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004622__133 | 1 | 0.0 | 15.6303 | 0 | [208, 369] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004622__133.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11563 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004645__970 | 1 | 0.0 | 23.1409 | 0 | [208, 555] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004645__970.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11564 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_004703__183 | 0 | 0.0 | 17.8331 | 0 | [208, 424] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004703__183.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11565 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005002__992 | 1 | 0.0 | 18.105 | 0 | [377, 402] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005002__992.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11566 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005014__877 | 1 | 0.0 | 12.2009 | 0 | [377, 257] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005014__877.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11567 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005026__187 | 1 | 0.0 | 11.7697 | 0 | [377, 246] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005026__187.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11568 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005037__250 | 1 | 0.0 | 10.9194 | 0 | [377, 225] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005037__250.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11569 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005051__288 | 5 | 0.0 | 13.4944 | 5 | [377, 288] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005051__288.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11570 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004825__605 | 1 | 0.0 | 12.3583 | 0 | [375, 261] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004825__605.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11571 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231228_004842__957 | 0 | 0.0 | 17.487 | 0 | [375, 387] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004842__957.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11572 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004901__240 | 1 | 0.0 | 18.4639 | 0 | [375, 411] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004901__240.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11573 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004925__926 | 1 | 0.0 | 24.0378 | 5 | [375, 546] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004925__926.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11574 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004944__463 | 1 | 0.0 | 19.0711 | 0 | [375, 426] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004944__463.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11575 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_124221__649 | 0 | 0.0 | 21.3836 | 0 | [66, 389] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_124221__649.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11576 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231226_124238__148 | 0 | 0.0 | 17.0188 | 0 | [66, 307] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_124238__148.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11577 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_124148__513 | 1 | 0.0 | 9.98767 | 0 | [69, 180] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_124148__513.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11578 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_124159__808 | 1 | 0.0 | 11.0499 | 0 | [69, 197] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_124159__808.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11579 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_091234__230 | 1 | 0.0 | 11.6593 | 0 | [69, 211] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_091234__230.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11580 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_124133__449 | 1 | 0.0 | 4.07262 | 0 | [110, 64] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_124133__449.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11581 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_124138__511 | 5 | 0.0 | 5.01771 | 5 | [110, 82] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_124138__511.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11582 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_091222__925 | 1 | 0.0 | 4.25023 | 0 | [110, 67] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_091222__925.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11583 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_124121__136 | 1 | 0.0 | 25.329 | 0 | [208, 448] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_124121__136.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11584 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_124129__750 | 0 | 0.0 | 7.56331 | 0 | [208, 120] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_124129__750.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11585 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_091218__856 | 1 | 0.0 | 31.0774 | 0 | [208, 394] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091218__856.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11586 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_124344__597 | 1 | 0.0 | 19.0455 | 0 | [377, 310] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124344__597.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11587 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_124431__806 | 0 | 0.0 | 47.0613 | 0 | [377, 735] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124431__806.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11588 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_091330__334 | 5 | 0.0 | 31.1692 | 5 | [377, 531] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091330__334.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11589 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_124249__480 | 1 | 0.0 | 11.1745 | 0 | [375, 167] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_124249__480.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11590 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_124324__837 | 1 | 0.0 | 35.4075 | 5 | [375, 598] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_124324__837.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11591 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_091259__677 | 1 | 0.0 | 25.0854 | 0 | [375, 421] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_091259__677.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11592 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_091715__495 | 1 | 0.0 | 56.3707 | 0 | [75, 333] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_091715__495.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11593 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_132525__325 | 0 | 0.0 | 50.038 | 0 | [75, 295] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_132525__325.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11594 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_132559__983 | 5 | 0.0 | 33.7019 | 5 | [75, 195] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_132559__983.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11595 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_132637__766 | 5 | 0.0 | 37.766 | 5 | [75, 220] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_132637__766.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11596 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_091619__233 | 1 | 0.0 | 53.8678 | 0 | [114, 312] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_091619__233.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11597 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_132350__315 | 1 | 0.0 | 46.2178 | 0 | [114, 266] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_132350__315.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11598 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_132406__976 | 1 | 0.0 | 16.2475 | 0 | [114, 82] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_132406__976.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11599 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_132435__218 | 1 | 0.0 | 28.8871 | 0 | [114, 160] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_132435__218.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11600 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_091525__841 | 5 | 0.0 | 69.7117 | 5 | [210, 251] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091525__841.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11601 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_132208__322 | 1 | 0.0 | 72.815 | 5 | [210, 380] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_132208__322.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11602 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_132231__693 | 0 | 0.0 | 22.2539 | 0 | [210, 104] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_132231__693.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11603 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_132304__857 | 0 | 0.0 | 32.5602 | 0 | [210, 167] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_132304__857.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11604 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_091914__163 | 1 | 0.0 | 70.3283 | 1 | [388, 358] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091914__163.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11605 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_133009__269 | 0 | 0.0 | 11.037 | 0 | [388, 5] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133009__269.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11606 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_133111__381 | 1 | 0.0 | 61.7347 | 0 | [388, 308] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133111__381.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11607 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_133225__108 | 1 | 0.0 | 74.7223 | 0 | [388, 384] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133225__108.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11608 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_091803__510 | 1 | 0.0 | 47.9829 | 5 | [386, 226] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_091803__510.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11609 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_132738__132 | 5 | 0.0 | 61.3755 | 5 | [386, 306] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_132738__132.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11610 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_132844__675 | 0 | 0.0 | 65.9628 | 0 | [386, 333] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_132844__675.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11611 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_132958__310 | 1 | 0.0 | 73.2895 | 0 | [386, 376] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_132958__310.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11612 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_093139__102 | 0 | 0.0 | 17.2295 | 0 | [75, 436] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_093139__102.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11613 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 5 | 20231225_093148__804 | 0 | 0.0 | 9.34367 | 0 | [75, 233] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_093148__804.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11614 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_093115__387 | 4 | 0.0 | 8.51272 | 4 | [78, 211] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_093115__387.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11615 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_093121__995 | 5 | 0.0 | 6.33191 | 5 | [78, 154] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_093121__995.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11616 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_090954__142 | 5 | 0.0 | 5.11181 | 5 | [78, 121] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_090954__142.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11617 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_093100__665 | 5 | 0.0 | 6.52583 | 5 | [119, 154] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_093100__665.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11618 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_093106__449 | 1 | 0.0 | 6.33345 | 0 | [119, 149] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_093106__449.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11619 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_090949__828 | 5 | 0.0 | 3.19025 | 5 | [119, 66] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_090949__828.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11620 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_093042__268 | 5 | 0.0 | 18.752 | 5 | [217, 296] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093042__268.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11621 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_093053__827 | 5 | 0.0 | 11.5098 | 5 | [217, 268] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093053__827.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11622 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_090946__638 | 4 | 0.0 | 18.9196 | 4 | [217, 303] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090946__638.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11623 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093225__856 | 5 | 0.0 | 10.3264 | 5 | [386, 209] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093225__856.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11624 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093230__435 | 1 | 0.0 | 5.18387 | 0 | [386, 78] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093230__435.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11625 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_091023__931 | 1 | 0.0 | 13.8186 | 0 | [386, 294] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091023__931.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11626 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093207__552 | 4 | 0.0 | 19.3037 | 4 | [384, 435] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_093207__552.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11627 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093214__746 | 5 | 0.0 | 7.1739 | 5 | [384, 133] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_093214__746.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11628 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_091009__670 | 1 | 0.0 | 14.7093 | 0 | [384, 320] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_091009__670.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11629 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231214_084508__666 | 0 | 0.0 | 15.7221 | 0 | [57, 469] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_084508__666.json | 0.0 | missing | missing | missing | |
| 11630 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 5 | 20231225_084346__371 | 0 | 0.0 | 12.4757 | 0 | [73, 407] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_084346__371.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11631 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231214_084452__103 | 1 | 0.0 | 11.7372 | 0 | [74, 349] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_084452__103.json | 55.0 | missing | missing | missing | |
| 11632 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_084329__968 | 1 | 0.0 | 12.0158 | 0 | [76, 391] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_084329__968.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11633 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_084334__871 | 1 | 0.0 | 4.56678 | 0 | [76, 141] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_084334__871.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11634 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_084805__904 | 5 | 0.0 | 5.7173 | 5 | [76, 179] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_084805__904.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11635 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084440__347 | 1 | 0.0 | 8.58027 | 0 | [103, 245] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_084440__347.json | 55.0 | missing | missing | missing | |
| 11636 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084311__311 | 1 | 0.0 | 4.38059 | 0 | [117, 130] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_084311__311.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11637 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084317__206 | 2 | 0.0 | 5.40921 | 5 | [117, 165] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_084317__206.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11638 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084800__741 | 1 | 0.0 | 4.27726 | 0 | [117, 125] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_084800__741.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11639 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084431__778 | 1 | 0.0 | 12.1676 | 0 | [201, 317] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084431__778.json | 55.0 | missing | missing | missing | |
| 11640 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_084307__798 | 0 | 0.0 | 13.8897 | 0 | [215, 426] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084307__798.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11641 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_084755__613 | 5 | 0.0 | 22.0678 | 5 | [215, 523] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084755__613.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11642 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_084542__317 | 0 | 0.0 | 11.4392 | 0 | [11, 318] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084542__317.json | 0.0 | missing | missing | missing | |
| 11643 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084431__936 | 1 | 0.0 | 14.3154 | 0 | [384, 407] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084431__936.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11644 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_084450__105 | 0 | 0.0 | 19.1579 | 0 | [384, 557] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084450__105.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11645 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_084832__241 | 5 | 0.0 | 14.8027 | 5 | [384, 418] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084832__241.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11646 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_084531__491 | 1 | 0.0 | 22.7711 | 0 | [374, 532] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_084531__491.json | 55.0 | missing | missing | missing | |
| 11647 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084409__381 | 1 | 0.0 | 10.0754 | 0 | [382, 273] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_084409__381.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11648 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084416__990 | 1 | 0.0 | 6.74894 | 0 | [382, 166] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_084416__990.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11649 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_084817__988 | 0 | 0.0 | 11.7032 | 0 | [382, 321] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_084817__988.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11650 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231214_085656__451 | 0 | 0.0 | 10.1254 | 0 | [57, 307] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__AsIs__1SHOT__20231214_085656__451.json | 0.0 | missing | missing | missing | |
| 11651 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_090520__337 | 0 | 0.0 | 17.2963 | 0 | [74, 316] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__AsIs__1SHOT__20231225_090520__337.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11652 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | AsIs | 1SHOT | false | false | 5 | 20231225_090523__827 | 0 | 0.0 | 2.756 | 0 | [74, 38] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__AsIs__1SHOT__20231225_090523__827.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11653 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231214_085646__465 | 1 | 0.0 | 10.6528 | 0 | [74, 318] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__InJulia__1SHOT__20231214_085646__465.json | 55.0 | missing | missing | missing | |
| 11654 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_090453__406 | 5 | 0.0 | 14.3116 | 5 | [77, 260] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__InJulia__1SHOT__20231225_090453__406.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11655 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_090503__152 | 5 | 0.0 | 9.37374 | 5 | [77, 166] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__InJulia__1SHOT__20231225_090503__152.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11656 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231227_085925__624 | 5 | 0.0 | 9.8162 | 5 | [77, 173] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__InJulia__1SHOT__20231227_085925__624.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11657 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_085635__500 | 1 | 0.0 | 8.31986 | 0 | [103, 236] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_085635__500.json | 55.0 | missing | missing | missing | |
| 11658 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_090436__456 | 0 | 0.0 | 2.92999 | 0 | [116, 37] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_090436__456.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11659 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_090439__313 | 0 | 0.0 | 3.32145 | 0 | [116, 44] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_090439__313.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11660 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085915__242 | 2 | 0.0 | 14.4734 | 1 | [116, 255] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_085915__242.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11661 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_085626__932 | 1 | 0.0 | 20.8274 | 0 | [201, 555] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085626__932.json | 55.0 | missing | missing | missing | |
| 11662 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_090408__658 | 0 | 0.0 | 16.7117 | 0 | [214, 106] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090408__658.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11663 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090433__386 | 4 | 0.0 | 25.1049 | 4 | [214, 434] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090433__386.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11664 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_085900__133 | 0 | 0.0 | 13.6847 | 0 | [214, 52] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085900__133.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11665 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_085733__701 | 1 | 0.0 | 16.0575 | 0 | [11, 440] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085733__701.json | 55.0 | missing | missing | missing | |
| 11666 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090648__787 | 1 | 0.0 | 26.2603 | 0 | [380, 420] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090648__787.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11667 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090717__115 | 4 | 0.0 | 28.8697 | 4 | [380, 465] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090717__115.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11668 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_085953__728 | 1 | 0.0 | 16.1163 | 0 | [380, 238] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085953__728.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11669 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085717__982 | 1 | 0.0 | 21.294 | 0 | [374, 495] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_085717__982.json | 55.0 | missing | missing | missing | |
| 11670 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_090544__919 | 0 | 0.0 | 20.8684 | 0 | [377, 325] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_090544__919.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11671 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_090621__804 | 0 | 0.0 | 37.5861 | 0 | [377, 615] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_090621__804.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11672 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_085937__714 | 0 | 0.0 | 12.0577 | 0 | [377, 165] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_085937__714.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11673 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_093536__459 | 0 | 0.0 | 54.9901 | 0 | [68, 1863] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_093536__459.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11674 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 5 | 20231225_093553__426 | 0 | 0.0 | 17.6002 | 0 | [68, 671] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_093553__426.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11675 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_093427__311 | 0 | 0.0 | 18.2586 | 0 | [71, 695] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_093427__311.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11676 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_093441__448 | 0 | 0.0 | 13.5181 | 0 | [71, 521] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_093441__448.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11677 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_091115__928 | 0 | 0.0 | 18.5206 | 0 | [71, 697] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_091115__928.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11678 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_093334__712 | 0 | 0.0 | 28.6052 | 0 | [108, 1041] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_093334__712.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11679 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_093409__210 | 0 | 0.0 | 34.5877 | 0 | [108, 1235] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_093409__210.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11680 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_091057__515 | 0 | 0.0 | 23.8123 | 0 | [108, 872] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_091057__515.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11681 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_093236__943 | 0 | 0.0 | 5.66503 | 0 | [197, 54] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093236__943.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11682 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_093305__137 | 0 | 0.0 | 29.5708 | 0 | [197, 1045] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093305__137.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11683 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_091033__259 | 0 | 0.0 | 10.0513 | 0 | [197, 233] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091033__259.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11684 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_093610__604 | 0 | 0.0 | 4.21513 | 0 | [360, 117] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093610__604.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11685 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093618__259 | 1 | 0.0 | 7.87215 | 0 | [360, 254] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093618__259.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11686 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_091147__139 | 0 | 0.0 | 23.3238 | 0 | [360, 787] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091147__139.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11687 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093602__978 | 1 | 0.0 | 8.89339 | 0 | [357, 291] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_093602__978.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11688 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_093605__408 | 0 | 0.0 | 3.37804 | 0 | [357, 84] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_093605__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11689 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_091123__461 | 1 | 0.0 | 8.09956 | 0 | [357, 260] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_091123__461.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11690 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231214_085837__690 | 0 | 0.0 | 17.6668 | 0 | [57, 523] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_085837__690.json | 0.0 | missing | missing | missing | |
| 11691 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | AsIs | 1SHOT | true | true | 5 | 20231225_091136__277 | 5 | 0.0 | 49.4919 | 5 | [82, 388] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_091136__277.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11692 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 5 | 20231225_091230__907 | 0 | 0.0 | 54.028 | 0 | [82, 424] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_091230__907.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11693 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231214_085819__418 | 1 | 0.0 | 12.5925 | 0 | [74, 374] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_085819__418.json | 55.0 | missing | missing | missing | |
| 11694 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_091025__616 | 5 | 0.0 | 40.2628 | 5 | [85, 314] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_091025__616.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11695 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_091047__232 | 5 | 0.0 | 21.6464 | 5 | [85, 162] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_091047__232.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11696 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231227_090137__387 | 5 | 0.0 | 20.2226 | 5 | [85, 150] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_090137__387.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11697 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_085807__811 | 1 | 0.0 | 10.5712 | 0 | [103, 304] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_085807__811.json | 55.0 | missing | missing | missing | |
| 11698 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090919__134 | 1 | 0.0 | 23.8352 | 0 | [124, 175] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_090919__134.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11699 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090945__494 | 5 | 0.0 | 25.8532 | 5 | [124, 191] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_090945__494.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11700 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_090117__204 | 5 | 0.0 | 33.1074 | 5 | [124, 248] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_090117__204.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11701 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_085756__401 | 1 | 0.0 | 22.4553 | 0 | [201, 597] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085756__401.json | 55.0 | missing | missing | missing | |
| 11702 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090809__435 | 5 | 0.0 | 51.8628 | 5 | [222, 206] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090809__435.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11703 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090855__931 | 5 | 0.0 | 45.9755 | 5 | [222, 335] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090855__931.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11704 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_090044__977 | 5 | 0.0 | 50.9888 | 5 | [222, 208] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090044__977.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11705 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_085923__100 | 1 | 0.0 | 15.7539 | 0 | [11, 432] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085923__100.json | 55.0 | missing | missing | missing | |
| 11706 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_091418__734 | 5 | 0.0 | 27.38 | 5 | [388, 153] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_091418__734.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11707 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_091512__293 | 3 | 0.0 | 53.763 | 4 | [388, 360] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_091512__293.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11708 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_090254__424 | 5 | 0.0 | 33.8749 | 5 | [388, 203] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_090254__424.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11709 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085908__237 | 1 | 0.0 | 30.6426 | 0 | [374, 726] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_085908__237.json | 55.0 | missing | missing | missing | |
| 11710 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_091314__809 | 5 | 0.0 | 43.2596 | 5 | [385, 278] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_091314__809.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11711 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_091351__701 | 5 | 0.0 | 37.2954 | 5 | [385, 231] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_091351__701.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11712 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_090220__890 | 0 | 0.0 | 42.394 | 0 | [385, 270] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_090220__890.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11713 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231225_092629__516 | 1 | 0.0 | 14.6212 | 0 | [75, 246] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_092629__516.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11714 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | true | true | 5 | 20231225_092636__398 | 1 | 0.0 | 7.00732 | 0 | [75, 112] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_092636__398.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11715 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_092609__464 | 1 | 0.0 | 16.0192 | 0 | [78, 271] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_092609__464.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11716 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_092614__625 | 5 | 0.0 | 5.10948 | 5 | [78, 78] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_092614__625.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11717 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_090745__881 | 1 | 0.0 | 5.36729 | 0 | [78, 82] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_090745__881.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11718 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_092546__569 | 1 | 0.0 | 15.0222 | 0 | [119, 248] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_092546__569.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11719 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_092553__543 | 1 | 0.0 | 7.01194 | 0 | [119, 107] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_092553__543.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11720 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_090740__781 | 5 | 0.0 | 11.8392 | 5 | [119, 191] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_090740__781.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11721 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_092522__952 | 1 | 0.0 | 22.4848 | 0 | [217, 199] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_092522__952.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11722 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_092531__467 | 0 | 0.0 | 8.99421 | 0 | [217, 128] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_092531__467.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11723 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_090728__253 | 0 | 0.0 | 20.108 | 0 | [217, 171] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090728__253.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11724 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_092728__570 | 1 | 0.0 | 10.2903 | 0 | [386, 123] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_092728__570.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11725 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_092747__145 | 1 | 0.0 | 19.1297 | 0 | [386, 273] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_092747__145.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11726 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_090822__616 | 5 | 0.0 | 17.372 | 5 | [386, 242] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_090822__616.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11727 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092654__333 | 1 | 0.0 | 18.0566 | 0 | [384, 259] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092654__333.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11728 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092717__707 | 1 | 0.0 | 22.6462 | 0 | [384, 336] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092717__707.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11729 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_090805__233 | 1 | 0.0 | 19.8779 | 0 | [384, 288] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_090805__233.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11730 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231214_085505__607 | 0 | 0.0 | 17.3768 | 0 | [57, 516] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__AsIs__1SHOT__20231214_085505__607.json | 0.0 | missing | missing | missing | |
| 11731 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_090322__184 | 0 | 0.0 | 8.26827 | 0 | [78, 470] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_090322__184.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11732 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | AsIs | 1SHOT | false | false | 5 | 20231225_090327__288 | 0 | 0.0 | 5.78495 | 0 | [78, 333] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__AsIs__1SHOT__20231225_090327__288.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11733 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231214_085448__704 | 1 | 0.0 | 16.9289 | 0 | [74, 498] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_085448__704.json | 55.0 | missing | missing | missing | |
| 11734 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_090307__930 | 0 | 0.0 | 5.82786 | 0 | [81, 334] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_090307__930.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11735 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_090313__673 | 0 | 0.0 | 5.98495 | 0 | [81, 344] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_090313__673.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11736 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231227_085834__970 | 0 | 0.0 | 3.67644 | 0 | [81, 207] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_085834__970.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11737 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_085431__459 | 1 | 0.0 | 7.06711 | 0 | [103, 199] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_085431__459.json | 55.0 | missing | missing | missing | |
| 11738 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090300__600 | 1 | 0.0 | 3.48262 | 0 | [118, 193] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_090300__600.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11739 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_090302__705 | 0 | 0.0 | 1.4102 | 0 | [118, 69] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_090302__705.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11740 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085830__868 | 1 | 0.0 | 1.63907 | 0 | [118, 81] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_085830__868.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11741 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_085424__836 | 1 | 0.0 | 17.3931 | 0 | [201, 462] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085424__836.json | 55.0 | missing | missing | missing | |
| 11742 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_090248__367 | 0 | 0.0 | 10.9244 | 0 | [204, 426] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090248__367.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11743 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090256__979 | 1 | 0.0 | 8.60965 | 0 | [204, 454] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090256__979.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11744 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_085828__610 | 1 | 0.0 | 9.59217 | 0 | [204, 361] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085828__610.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11745 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_085605__293 | 0 | 0.0 | 39.8487 | 0 | [11, 1010] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085605__293.json | 0.0 | missing | missing | missing | |
| 11746 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090347__914 | 1 | 0.0 | 6.35457 | 0 | [368, 291] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090347__914.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11747 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090351__219 | 1 | 0.0 | 3.95271 | 0 | [368, 162] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090351__219.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11748 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_085846__446 | 1 | 0.0 | 5.35738 | 0 | [368, 235] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085846__446.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11749 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085526__943 | 1 | 0.0 | 20.1678 | 0 | [374, 465] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_085526__943.json | 55.0 | missing | missing | missing | |
| 11750 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_090336__144 | 0 | 0.0 | 8.91353 | 0 | [366, 422] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_090336__144.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11751 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_090340__683 | 1 | 0.0 | 3.81081 | 0 | [366, 154] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_090340__683.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11752 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_085841__105 | 0 | 0.0 | 7.37699 | 0 | [366, 339] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_085841__105.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11753 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231214_084642__492 | 0 | 0.0 | 7.84663 | 0 | [57, 238] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__AsIs__1SHOT__20231214_084642__492.json | 0.0 | missing | missing | missing | |
| 11754 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_084610__661 | 0 | 0.0 | 8.60086 | 0 | [75, 277] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_084610__661.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11755 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | AsIs | 1SHOT | false | false | 5 | 20231225_084618__361 | 0 | 0.0 | 8.00023 | 0 | [75, 257] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__AsIs__1SHOT__20231225_084618__361.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11756 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231214_084634__359 | 1 | 0.0 | 16.8786 | 0 | [74, 497] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_084634__359.json | 55.0 | missing | missing | missing | |
| 11757 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_084553__578 | 1 | 0.0 | 15.1058 | 0 | [78, 491] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_084553__578.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11758 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_084601__632 | 5 | 0.0 | 8.18164 | 5 | [78, 263] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_084601__632.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11759 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_084911__780 | 1 | 0.0 | 10.8221 | 0 | [78, 345] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_084911__780.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11760 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084617__751 | 1 | 0.0 | 13.2075 | 0 | [103, 381] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_084617__751.json | 55.0 | missing | missing | missing | |
| 11761 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084528__421 | 5 | 0.0 | 6.17634 | 5 | [119, 190] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_084528__421.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11762 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084537__794 | 1 | 0.0 | 9.13268 | 0 | [119, 288] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_084537__794.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11763 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084900__314 | 1 | 0.0 | 6.86598 | 0 | [119, 211] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_084900__314.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11764 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084604__216 | 1 | 0.0 | 21.903 | 0 | [201, 583] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084604__216.json | 55.0 | missing | missing | missing | |
| 11765 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084514__949 | 1 | 0.0 | 24.2235 | 0 | [217, 582] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084514__949.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11766 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084522__413 | 1 | 0.0 | 7.69081 | 0 | [217, 223] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084522__413.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11767 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_084853__821 | 1 | 0.0 | 21.0202 | 0 | [217, 483] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084853__821.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11768 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_084756__856 | 0 | 0.0 | 28.9025 | 0 | [11, 760] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084756__856.json | 0.0 | missing | missing | missing | |
| 11769 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084649__607 | 5 | 0.0 | 12.0454 | 5 | [386, 330] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084649__607.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11770 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084659__181 | 5 | 0.0 | 9.24105 | 5 | [386, 241] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084659__181.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11771 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_084932__155 | 5 | 0.0 | 11.84 | 5 | [386, 321] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084932__155.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11772 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_084727__246 | 1 | 0.0 | 44.94 | 0 | [374, 1054] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_084727__246.json | 55.0 | missing | missing | missing | |
| 11773 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084626__844 | 5 | 0.0 | 8.21717 | 5 | [384, 212] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_084626__844.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11774 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084637__916 | 5 | 0.0 | 10.7526 | 5 | [384, 294] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_084637__916.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11775 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_084921__444 | 5 | 0.0 | 9.60602 | 5 | [384, 255] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_084921__444.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11776 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | AsIs | 1SHOT | false | false | 5 | 20231214_084842__603 | 0 | 0.0 | 12.0498 | 0 | [57, 363] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__AsIs__1SHOT__20231214_084842__603.json | 0.0 | missing | missing | missing | |
| 11777 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | AsIs | 1SHOT | true | true | 5 | 20231225_085231__152 | 1 | 0.0 | 52.4113 | 0 | [72, 397] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_085231__152.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11778 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | AsIs | 1SHOT | true | true | 5 | 20231225_085342__215 | 1 | 0.0 | 71.5525 | 0 | [72, 543] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__AsIs__1SHOT__20231225_085342__215.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11779 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231214_084830__817 | 1 | 0.0 | 10.0761 | 0 | [74, 300] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_084830__817.json | 55.0 | missing | missing | missing | |
| 11780 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_085024__650 | 0 | 0.0 | 63.4636 | 0 | [75, 482] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_085024__650.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11781 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_085138__758 | 1 | 0.0 | 74.5738 | 0 | [75, 565] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_085138__758.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11782 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231227_085217__462 | 0 | 0.0 | 71.4576 | 0 | [75, 538] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_085217__462.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11783 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084820__259 | 1 | 0.0 | 7.67446 | 0 | [103, 218] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_084820__259.json | 55.0 | missing | missing | missing | |
| 11784 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084857__606 | 1 | 0.0 | 13.3624 | 0 | [114, 86] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_084857__606.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11785 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084920__376 | 1 | 0.0 | 22.5813 | 0 | [114, 159] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_084920__376.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11786 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085105__943 | 1 | 0.0 | 49.4696 | 0 | [114, 365] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_085105__943.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11787 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084812__384 | 1 | 0.0 | 15.9029 | 0 | [201, 422] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084812__384.json | 55.0 | missing | missing | missing | |
| 11788 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084819__596 | 1 | 0.0 | 80.5485 | 0 | [210, 404] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084819__596.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11789 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084844__582 | 1 | 0.0 | 24.1854 | 0 | [210, 154] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084844__582.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11790 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_085016__934 | 0 | 0.0 | 43.031 | 0 | [210, 126] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085016__934.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11791 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_084928__617 | 1 | 0.0 | 24.7752 | 0 | [11, 659] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084928__617.json | 55.0 | missing | missing | missing | |
| 11792 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_085506__924 | 1 | 0.0 | 19.5945 | 0 | [388, 86] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_085506__924.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11793 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_085622__224 | 1 | 0.0 | 75.6569 | 0 | [388, 505] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_085622__224.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11794 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_085549__170 | 0 | 0.0 | 129.041 | 0 | [388, 879] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085549__170.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11795 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_084903__326 | 1 | 0.0 | 20.8251 | 0 | [374, 481] | 0.10.0-DEV | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_084903__326.json | 55.0 | missing | missing | missing | |
| 11796 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_085407__347 | 5 | 0.0 | 24.9873 | 5 | [386, 127] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_085407__347.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11797 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_085447__339 | 1 | 0.0 | 39.2992 | 0 | [386, 236] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_085447__339.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11798 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_085339__370 | 0 | 0.0 | 82.8339 | 0 | [386, 552] | 0.10.0-DEV | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_085339__370.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11799 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | AsIs | 1SHOT | false | false | 6 | 20231214_090642__514 | 0 | 0.0 | 16.9645 | 0 | [48, 506] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231214_090642__514.json | 0.0 | missing | missing | missing | |
| 11800 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | AsIs | 1SHOT | true | false | 6 | 20231225_073210__744 | 0 | 0.0 | 21.5092 | 0 | [70, 391] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_073210__744.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11801 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | AsIs | 1SHOT | true | true | 6 | 20231225_073225__402 | 3 | 0.0 | 14.3815 | 2 | [70, 258] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__AsIs__1SHOT__20231225_073225__402.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11802 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | InJulia | 1SHOT | false | false | 6 | 20231214_090625__335 | 0 | 0.0 | 9.72051 | 0 | [65, 290] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_090625__335.json | 0.0 | missing | missing | missing | |
| 11803 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | InJulia | 1SHOT | true | true | 6 | 20231225_073136__448 | 1 | 0.0 | 20.8466 | 2 | [73, 377] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_073136__448.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11804 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | InJulia | 1SHOT | true | true | 6 | 20231225_073148__216 | 5 | 0.0 | 11.969 | 2 | [73, 212] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_073148__216.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11805 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | InJulia | 1SHOT | true | true | 6 | 20231227_092722__153 | 5 | 0.0 | 9.92689 | 2 | [73, 176] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_092722__153.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11806 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231214_090616__453 | 0 | 0.0 | 12.6159 | 0 | [94, 369] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_090616__453.json | 0.0 | missing | missing | missing | |
| 11807 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073109__393 | 5 | 0.0 | 8.43632 | 2 | [111, 140] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_073109__393.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11808 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_073116__340 | 0 | 0.0 | 6.42257 | 0 | [111, 103] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_073116__340.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11809 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_092712__712 | 0 | 0.0 | 7.36274 | 0 | [111, 122] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_092712__712.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11810 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231214_090603__297 | 0 | 0.0 | 15.4316 | 0 | [175, 418] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_090603__297.json | 0.0 | missing | missing | missing | |
| 11811 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_073047__989 | 0 | 0.0 | 23.0735 | 0 | [193, 190] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073047__989.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |

*(preview truncated — the table holds one row per evaluation run: device, test-case name, model, prompt template, whether the generated code parsed and executed, unit tests passed, token counts, elapsed time, and the resulting 0–100 score)*
| 11882 | NVIDIA-RTX-4090-4x | wrap_string | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_055416__757 | 0 | 0.0 | 6.06853 | 0 | [0, 216] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_055416__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11883 | NVIDIA-RTX-4090-4x | wrap_string | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_055433__423 | 1 | 0.0 | 17.2873 | 0 | [0, 608] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_055433__423.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11884 | NVIDIA-RTX-4090-4x | wrap_string | codellama:34b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_055446__455 | 0 | 0.0 | 12.9258 | 0 | [0, 454] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_055446__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11885 | NVIDIA-RTX-4090-4x | wrap_string | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20240201_055156__428 | 0 | 0.0 | 5.3162 | 0 | [0, 190] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_055156__428.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11886 | NVIDIA-RTX-4090-4x | wrap_string | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_055203__580 | 0 | 0.0 | 7.09641 | 2 | [0, 253] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_055203__580.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11887 | NVIDIA-RTX-4090-4x | wrap_string | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20240201_055210__175 | 0 | 0.0 | 7.07487 | 0 | [0, 252] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_055210__175.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11888 | NVIDIA-RTX-4090-4x | wrap_string | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_055242__288 | 5 | 0.0 | 31.5575 | 2 | [0, 1114] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_055242__288.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11889 | NVIDIA-RTX-4090-4x | wrap_string | codellama:34b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_055249__556 | 0 | 0.0 | 6.65713 | 0 | [0, 237] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:34b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_055249__556.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11890 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 6 | 20240201_053619__513 | 0 | 0.0 | 21.1885 | 0 | [0, 513] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_053619__513.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11891 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 6 | 20240201_053630__268 | 4 | 0.0 | 11.5716 | 2 | [0, 281] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_053630__268.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11892 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 6 | 20240201_053641__903 | 4 | 0.0 | 10.9747 | 2 | [0, 267] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_053641__903.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11893 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | InJulia | 1SHOT | false | false | 6 | 20240201_053700__380 | 0 | 0.0 | 18.8342 | 0 | [0, 458] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_053700__380.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11894 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | InJulia | 1SHOT | true | true | 6 | 20240201_053729__387 | 5 | 0.0 | 29.1485 | 1 | [0, 704] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__InJulia__1SHOT__20240201_053729__387.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11895 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_053343__573 | 0 | 0.0 | 15.7541 | 1 | [0, 385] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_053343__573.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11896 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_053358__366 | 3 | 0.0 | 14.4397 | 1 | [0, 352] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_053358__366.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11897 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240201_053409__895 | 0 | 0.0 | 11.6359 | 0 | [0, 286] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_053409__895.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11898 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240201_053424__931 | 0 | 0.0 | 14.3262 | 0 | [0, 346] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_053424__931.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11899 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_053435__755 | 0 | 0.0 | 11.8947 | 1 | [0, 292] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertAsk__1SHOT__20240201_053435__755.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11900 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_053212__325 | 0 | 0.0 | 10.5306 | 0 | [0, 255] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_053212__325.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11901 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_053212__838 | 0 | 0.0 | 0.128407 | 0 | [0, 3] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_053212__838.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11902 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_053213__326 | 0 | 0.0 | 0.126374 | 0 | [0, 3] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_053213__326.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11903 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20240201_053223__343 | 0 | 0.0 | 10.6534 | 0 | [0, 258] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_053223__343.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11904 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_053226__227 | 0 | 0.0 | 3.25533 | 0 | [0, 79] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaExpertCoTTask__1SHOT__20240201_053226__227.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11905 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_054041__299 | 0 | 0.0 | 1.60252 | 0 | [0, 39] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054041__299.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11906 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_054042__515 | 0 | 0.0 | 1.56081 | 0 | [0, 38] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054042__515.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11907 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_054101__455 | 3 | 0.0 | 18.0915 | 2 | [0, 437] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054101__455.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11908 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_054116__321 | 2 | 0.0 | 15.3623 | 2 | [0, 372] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054116__321.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11909 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20240201_054134__789 | 0 | 0.0 | 18.5372 | 0 | [0, 448] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054134__789.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11910 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20240201_053852__406 | 0 | 0.0 | 9.34001 | 0 | [0, 226] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_053852__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11911 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_053907__479 | 3 | 0.0 | 14.6692 | 2 | [0, 354] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_053907__479.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11912 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_053920__954 | 5 | 0.0 | 13.3124 | 2 | [0, 323] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_053920__954.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11913 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20240201_053929__828 | 0 | 0.0 | 8.491 | 0 | [0, 206] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_053929__828.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11914 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q2_K | JuliaRecapTask | 1SHOT | true | false | 6 | 20240201_053948__709 | 0 | 0.0 | 18.7489 | 0 | [0, 454] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q2_K/evaluation__JuliaRecapTask__1SHOT__20240201_053948__709.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11915 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20240201_052244__712 | 3 | 0.0 | 23.5354 | 1 | [0, 440] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_052244__712.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11916 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20240201_052257__746 | 2 | 0.0 | 12.7448 | 2 | [0, 238] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_052257__746.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11917 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20240201_052319__423 | 5 | 0.0 | 21.845 | 2 | [0, 406] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_052319__423.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11918 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 6 | 20240201_052341__552 | 0 | 0.0 | 21.4266 | 0 | [0, 397] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_052341__552.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11919 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20240201_052415__298 | 4 | 0.0 | 34.8133 | 2 | [0, 645] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_052415__298.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11920 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_051851__158 | 4 | 0.0 | 26.5359 | 2 | [0, 491] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_051851__158.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11921 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240201_051913__161 | 0 | 0.0 | 22.8458 | 0 | [0, 425] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_051913__161.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11922 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240201_051926__209 | 0 | 0.0 | 12.0299 | 0 | [0, 225] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_051926__209.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11923 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240201_051947__569 | 0 | 0.0 | 21.2323 | 0 | [0, 396] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_051947__569.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11924 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240201_052006__924 | 0 | 0.0 | 19.1708 | 0 | [0, 358] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_052006__924.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11925 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_051559__541 | 0 | 0.0 | 11.0459 | 0 | [0, 204] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_051559__541.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11926 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_051623__534 | 0 | 0.0 | 24.7812 | 1 | [0, 457] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_051623__534.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11927 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_051630__220 | 0 | 0.0 | 6.39402 | 0 | [0, 119] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_051630__220.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11928 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_051654__187 | 0 | 0.0 | 24.0605 | 0 | [0, 444] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_051654__187.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11929 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_051657__320 | 0 | 0.0 | 3.10444 | 0 | [0, 58] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_051657__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11930 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_052919__978 | 1 | 0.0 | 16.1809 | 2 | [0, 299] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_052919__978.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11931 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_052942__152 | 0 | 0.0 | 23.2568 | 0 | [0, 422] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_052942__152.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11932 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_053006__524 | 0 | 0.0 | 23.5493 | 1 | [0, 427] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_053006__524.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11933 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20240201_053021__281 | 0 | 0.0 | 15.3254 | 0 | [0, 283] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_053021__281.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11934 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_053048__405 | 5 | 0.0 | 27.3611 | 2 | [0, 503] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_053048__405.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11935 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20240201_052605__470 | 0 | 0.0 | 19.0761 | 0 | [0, 351] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_052605__470.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11936 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20240201_052628__747 | 0 | 0.0 | 22.9801 | 0 | [0, 424] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_052628__747.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11937 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20240201_052641__866 | 0 | 0.0 | 12.8194 | 0 | [0, 237] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_052641__866.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11938 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_052702__355 | 1 | 0.0 | 20.9097 | 2 | [0, 385] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_052702__355.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11939 | NVIDIA-RTX-4090-4x | wrap_string | codellama:70b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20240201_052711__636 | 0 | 0.0 | 9.18686 | 0 | [0, 170] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:70b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_052711__636.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11940 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20240201_054525__100 | 4 | 0.0 | 3.40038 | 2 | [0, 406] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_054525__100.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11941 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20240201_054527__306 | 3 | 0.0 | 2.72605 | 2 | [0, 328] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_054527__306.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11942 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20240201_054530__746 | 0 | 0.0 | 2.30826 | 0 | [0, 278] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_054530__746.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11943 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20240201_054533__268 | 5 | 0.0 | 2.82299 | 2 | [0, 338] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_054533__268.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11944 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 6 | 20240201_054535__867 | 0 | 0.0 | 2.22407 | 0 | [0, 267] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20240201_054535__867.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11945 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240201_054502__497 | 0 | 0.0 | 0.995548 | 0 | [0, 119] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_054502__497.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11946 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_054503__329 | 0 | 0.0 | 1.42274 | 0 | [0, 171] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_054503__329.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11947 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_054506__330 | 1 | 0.0 | 2.51414 | 0 | [0, 300] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_054506__330.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11948 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_054507__701 | 3 | 0.0 | 1.23407 | 2 | [0, 148] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_054507__701.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11949 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_054509__823 | 4 | 0.0 | 1.67103 | 2 | [0, 201] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20240201_054509__823.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11950 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_054446__412 | 5 | 0.0 | 1.86127 | 2 | [0, 222] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_054446__412.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11951 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_054447__429 | 0 | 0.0 | 1.28016 | 0 | [0, 153] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_054447__429.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11952 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_054449__759 | 5 | 0.0 | 1.9781 | 2 | [0, 234] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_054449__759.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11953 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240201_054451__193 | 0 | 0.0 | 1.18099 | 0 | [0, 141] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_054451__193.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11954 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_054452__647 | 1 | 0.0 | 1.47329 | 0 | [0, 175] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20240201_054452__647.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11955 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20240201_054617__112 | 0 | 0.0 | 1.34594 | 0 | [0, 158] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054617__112.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11956 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_054620__926 | 0 | 0.0 | 3.04231 | 2 | [0, 351] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054620__926.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11957 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_054623__818 | 5 | 0.0 | 2.04901 | 2 | [0, 235] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054623__818.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11958 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_054624__923 | 5 | 0.0 | 1.51375 | 2 | [0, 178] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054624__923.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11959 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_054628__404 | 3 | 0.0 | 3.69319 | 2 | [0, 429] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20240201_054628__404.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11960 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_054549__879 | 4 | 0.0 | 1.82914 | 2 | [0, 215] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_054549__879.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11961 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_054551__965 | 3 | 0.0 | 2.23532 | 2 | [0, 262] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_054551__965.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11962 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_054553__248 | 0 | 0.0 | 1.60577 | 2 | [0, 189] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_054553__248.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11963 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_054555__171 | 2 | 0.0 | 1.91058 | 0 | [0, 225] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_054555__171.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11964 | NVIDIA-RTX-4090-4x | wrap_string | codellama:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_054557__275 | 2 | 0.0 | 2.04889 | 2 | [0, 241] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20240201_054557__275.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11965 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 6 | 20231225_075816__350 | 0 | 0.0 | 61.1953 | 0 | [62, 340] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_075816__350.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11966 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 6 | 20231225_075911__805 | 1 | 0.0 | 54.63 | 0 | [62, 328] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_075911__805.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11967 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_075628__848 | 0 | 0.0 | 59.0252 | 1 | [65, 328] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_075628__848.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11968 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_075713__902 | 0 | 0.0 | 43.7698 | 1 | [65, 233] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_075713__902.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11969 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_093836__439 | 0 | 0.0 | 64.2913 | 1 | [65, 391] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_093836__439.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11970 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_075417__621 | 0 | 0.0 | 60.2185 | 1 | [106, 321] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_075417__621.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11971 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231225_075528__670 | 0 | 0.0 | 69.9139 | 0 | [106, 402] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_075528__670.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11972 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_093731__258 | 0 | 0.0 | 93.3468 | 0 | [106, 561] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_093731__258.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11973 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231225_075209__622 | 0 | 0.0 | 80.5325 | 0 | [187, 273] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_075209__622.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11974 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_075314__780 | 0 | 0.0 | 64.1921 | 0 | [187, 363] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_075314__780.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11975 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_093557__864 | 0 | 0.0 | 58.4494 | 1 | [187, 187] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_093557__864.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11976 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_080319__125 | 0 | 0.0 | 86.019 | 1 | [394, 444] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_080319__125.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11977 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_080438__663 | 6 | 0.0 | 77.0306 | 2 | [394, 385] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_080438__663.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11978 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_094112__111 | 3 | 0.0 | 55.5684 | 2 | [394, 279] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094112__111.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 11979 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_080054__598 | 0 | 0.0 | 102.021 | 0 | [392, 545] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_080054__598.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11980 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_080151__170 | 5 | 0.0 | 56.3985 | 2 | [392, 269] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_080151__170.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 11981 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_094016__865 | 0 | 0.0 | 100.144 | 0 | [392, 545] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_094016__865.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11982 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 6 | 20231227_094850__763 | 0 | 0.0 | 12.2957 | 0 | [63, 475] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_094850__763.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11983 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 6 | 20231227_133332__431 | 0 | 0.0 | 8.17932 | 0 | [63, 319] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_133332__431.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11984 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_133340__643 | 0 | 0.0 | 7.88626 | 0 | [63, 308] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_133340__643.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11985 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 6 | 20231227_133351__437 | 0 | 0.0 | 10.9243 | 0 | [63, 424] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_133351__437.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11986 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_094838__305 | 0 | 0.0 | 9.04442 | 0 | [100, 344] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_094838__305.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11987 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_133306__633 | 0 | 0.0 | 8.49489 | 0 | [100, 323] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_133306__633.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11988 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_133318__832 | 0 | 0.0 | 12.4017 | 0 | [100, 470] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_133318__832.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11989 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_133324__452 | 0 | 0.0 | 5.32769 | 0 | [100, 200] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_133324__452.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11990 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_094829__867 | 0 | 0.0 | 7.46961 | 0 | [179, 142] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094829__867.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11991 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_133237__971 | 0 | 0.0 | 11.0759 | 0 | [179, 285] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133237__971.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11992 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_133241__680 | 0 | 0.0 | 4.42447 | 0 | [179, 154] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133241__680.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11993 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_133257__785 | 0 | 0.0 | 16.2088 | 0 | [179, 592] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133257__785.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11994 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_094915__772 | 0 | 0.0 | 12.9103 | 0 | [352, 437] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094915__772.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11995 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_133424__864 | 0 | 0.0 | 11.284 | 0 | [352, 379] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133424__864.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11996 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_133433__902 | 0 | 0.0 | 8.9297 | 0 | [352, 295] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133433__902.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11997 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_133444__708 | 0 | 0.0 | 11.1893 | 0 | [352, 376] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133444__708.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11998 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_094902__879 | 1 | 0.0 | 11.8357 | 0 | [349, 399] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_094902__879.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 11999 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_133359__271 | 0 | 0.0 | 8.03622 | 0 | [349, 262] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_133359__271.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12000 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_133405__498 | 0 | 0.0 | 6.15165 | 0 | [349, 192] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_133405__498.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12001 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_133413__604 | 0 | 0.0 | 8.00994 | 0 | [349, 261] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_133413__604.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12002 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | InJulia | 1SHOT | false | false | 6 | 20240217_113649__937 | 0 | 0.0 | 9.62214 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_113649__937.json | 0.0 | missing | missing | missing | |
| 12003 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 6 | 20240217_113659__139 | 0 | 0.0 | 9.49496 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_113659__139.json | 50.0 | missing | missing | missing | |
| 12004 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 6 | 20240217_113707__366 | 1 | 0.0 | 7.8586 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_113707__366.json | 54.1667 | missing | missing | missing | |
| 12005 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 6 | 20240217_115002__881 | 0 | 0.0 | 6.00058 | 0 | [242, 1216] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_115002__881.json | 50.0 | missing | missing | missing | |
| 12006 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | InJulia | 1SHOT | true | true | 6 | 20240217_125528__668 | 0 | 0.0 | 7.21301 | 0 | [242, 820] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__InJulia__1SHOT__20240217_125528__668.json | 50.0 | missing | missing | missing | |
| 12007 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240217_112642__722 | 1 | 0.0 | 3.61043 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_112642__722.json | 54.1667 | missing | missing | missing | |
| 12008 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240217_112650__417 | 0 | 0.0 | 7.60062 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_112650__417.json | 0.0 | missing | missing | missing | |
| 12009 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | false | 6 | 20240217_112653__762 | 0 | 0.0 | 3.42996 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_112653__762.json | 25.0 | missing | missing | missing | |
| 12010 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240217_113744__906 | 0 | 0.0 | 2.59143 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_113744__906.json | 0.0 | missing | missing | missing | |
| 12011 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240217_113756__859 | 3 | 0.0 | 11.5248 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertAsk__1SHOT__20240217_113756__859.json | 87.5 | missing | missing | missing | |
| 12012 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20240217_112552__261 | 0 | 0.0 | 2.50331 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112552__261.json | 25.0 | missing | missing | missing | |
| 12013 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20240217_112556__347 | 0 | 0.0 | 3.3541 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112556__347.json | 25.0 | missing | missing | missing | |
| 12014 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240217_112603__537 | 0 | 0.0 | 6.84447 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112603__537.json | 0.0 | missing | missing | missing | |
| 12015 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240217_112605__754 | 0 | 0.0 | 2.01272 | 1 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112605__754.json | 62.5 | missing | missing | missing | |
| 12016 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20240217_112607__951 | 0 | 0.0 | 2.4106 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaExpertCoTTask__1SHOT__20240217_112607__951.json | 25.0 | missing | missing | missing | |
| 12017 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20240217_112927__348 | 0 | 0.0 | 2.85322 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112927__348.json | 25.0 | missing | missing | missing | |
| 12018 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20240217_112937__696 | 0 | 0.0 | 2.40112 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_112937__696.json | 25.0 | missing | missing | missing | |
| 12019 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240217_113733__258 | 0 | 0.0 | 11.9846 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_113733__258.json | 0.0 | missing | missing | missing | |
| 12020 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240217_113736__613 | 0 | 0.0 | 3.36043 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_113736__613.json | 75.0 | missing | missing | missing | |
| 12021 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20240217_115005__853 | 0 | 0.0 | 3.42869 | 0 | [1361, 489] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapCoTTask__1SHOT__20240217_115005__853.json | 25.0 | missing | missing | missing | |
| 12022 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 6 | 20240217_112841__536 | 0 | 0.0 | 1.69968 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112841__536.json | 0.0 | missing | missing | missing | |
| 12023 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 6 | 20240217_112845__982 | 5 | 0.0 | 4.52638 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112845__982.json | 95.8333 | missing | missing | missing | |
| 12024 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | true | 6 | 20240217_112850__183 | 0 | 0.0 | 4.52645 | 2 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_112850__183.json | 75.0 | missing | missing | missing | |
| 12025 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | false | false | 6 | 20240217_113801__879 | 0 | 0.0 | 4.95872 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_113801__879.json | 0.0 | missing | missing | missing | |
| 12026 | Apple-MacBook-Pro-M1 | wrap_string | gemini-1.0-pro-latest | JuliaRecapTask | 1SHOT | true | false | 6 | 20240217_113806__847 | 0 | 0.0 | 5.3302 | 0 | [0, 0] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.GoogleSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemini-1.0-pro-latest/evaluation__JuliaRecapTask__1SHOT__20240217_113806__847.json | 25.0 | missing | missing | missing | |
| 12027 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 6 | 20240224_010200__127 | 0 | 0.0 | 29.2157 | 0 | [0, 320] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_010200__127.json | 0.0 | missing | missing | missing | |
| 12028 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 6 | 20240224_010237__131 | 0 | 0.0 | 36.4037 | 0 | [0, 399] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_010237__131.json | 0.0 | missing | missing | missing | |
| 12029 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 6 | 20240224_010309__137 | 0 | 0.0 | 32.6113 | 0 | [0, 359] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_010309__137.json | 0.0 | missing | missing | missing | |
| 12030 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 6 | 20240224_010347__855 | 0 | 0.0 | 38.0352 | 0 | [0, 419] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_010347__855.json | 0.0 | missing | missing | missing | |
| 12031 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | InJulia | 1SHOT | false | false | 6 | 20240224_010418__770 | 0 | 0.0 | 31.2992 | 0 | [0, 347] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__InJulia__1SHOT__20240224_010418__770.json | 0.0 | missing | missing | missing | |
| 12032 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240224_005742__392 | 0 | 0.0 | 31.7139 | 0 | [0, 339] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_005742__392.json | 0.0 | missing | missing | missing | |
| 12033 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240224_005807__110 | 0 | 0.0 | 24.8076 | 0 | [0, 266] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_005807__110.json | 0.0 | missing | missing | missing | |
| 12034 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240224_005831__340 | 0 | 0.0 | 23.3006 | 0 | [0, 250] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_005831__340.json | 0.0 | missing | missing | missing | |
| 12035 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240224_005856__117 | 0 | 0.0 | 24.9156 | 0 | [0, 268] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_005856__117.json | 0.0 | missing | missing | missing | |
| 12036 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20240224_005918__472 | 0 | 0.0 | 22.043 | 0 | [0, 238] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertAsk__1SHOT__20240224_005918__472.json | 0.0 | missing | missing | missing | |
| 12037 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240224_005247__348 | 0 | 0.0 | 34.5107 | 0 | [0, 368] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_005247__348.json | 0.0 | missing | missing | missing | |
| 12038 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240224_005310__877 | 0 | 0.0 | 23.4004 | 0 | [0, 250] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_005310__877.json | 0.0 | missing | missing | missing | |
| 12039 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240224_005343__200 | 0 | 0.0 | 32.4752 | 0 | [0, 346] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_005343__200.json | 0.0 | missing | missing | missing | |
| 12040 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240224_005422__321 | 0 | 0.0 | 39.5223 | 0 | [0, 421] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_005422__321.json | 0.0 | missing | missing | missing | |
| 12041 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20240224_005502__655 | 0 | 0.0 | 39.4797 | 0 | [0, 420] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20240224_005502__655.json | 0.0 | missing | missing | missing | |
| 12042 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20240224_011355__788 | 0 | 0.0 | 25.9101 | 0 | [0, 394] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_011355__788.json | 25.0 | missing | missing | missing | |
| 12043 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240224_011423__690 | 0 | 0.0 | 28.1962 | 0 | [0, 390] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_011423__690.json | 0.0 | missing | missing | missing | |
| 12044 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240224_011451__699 | 0 | 0.0 | 28.2003 | 0 | [0, 376] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_011451__699.json | 0.0 | missing | missing | missing | |
| 12045 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240224_011521__896 | 0 | 0.0 | 30.1324 | 0 | [0, 435] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_011521__896.json | 0.0 | missing | missing | missing | |
| 12046 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240224_011551__307 | 0 | 0.0 | 29.5151 | 0 | [0, 342] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20240224_011551__307.json | 0.0 | missing | missing | missing | |
| 12047 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20240224_010804__343 | 0 | 0.0 | 37.6858 | 0 | [0, 425] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_010804__343.json | 0.0 | missing | missing | missing | |
| 12048 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20240224_010841__461 | 0 | 0.0 | 36.8604 | 0 | [0, 416] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_010841__461.json | 0.0 | missing | missing | missing | |
| 12049 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20240224_010919__308 | 0 | 0.0 | 38.3968 | 0 | [0, 435] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_010919__308.json | 0.0 | missing | missing | missing | |
| 12050 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20240224_010957__458 | 0 | 0.0 | 38.0809 | 0 | [0, 428] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_010957__458.json | 0.0 | missing | missing | missing | |
| 12051 | Apple-MacBook-Pro-M1 | wrap_string | gemma:7b-instruct-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20240224_011032__828 | 0 | 0.0 | 34.5959 | 0 | [0, 388] | 0.13.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gemma:7b-instruct-q6_K/evaluation__JuliaRecapTask__1SHOT__20240224_011032__828.json | 0.0 | missing | missing | missing | |
| 12052 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 6 | 20231213_210245__201 | 0 | 0.0004925 | 7.72203 | 0 | [55, 310] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231213_210245__201.json | 50.0 | missing | missing | missing | |
| 12053 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 6 | 20231225_231530__809 | 5 | 0.0003875 | 3.65272 | 2 | [55, 240] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_231530__809.json | 95.8333 | missing | missing | missing | |
| 12054 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | AsIs | 1SHOT | true | true | 6 | 20231225_231533__295 | 2 | 0.00038 | 3.29023 | 2 | [55, 235] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231225_231533__295.json | 83.3333 | missing | missing | missing | |
| 12055 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo--optim | AsIs | 1SHOT | true | true | 6 | 20231215_202418__218 | 2 | 0.0 | 6.76262 | 2 | [55, 278] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__AsIs__1SHOT__20231215_202418__218.json | 83.3333 | 0.5 | missing | 0.5 | |
| 12056 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 6 | 20231213_210237__448 | 0 | 0.0004325 | 7.33414 | 1 | [58, 269] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231213_210237__448.json | 62.5 | missing | missing | missing | |
| 12057 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 6 | 20231225_231522__729 | 2 | 0.0004325 | 3.75377 | 2 | [58, 269] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_231522__729.json | 83.3333 | missing | missing | missing | |
| 12058 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 6 | 20231225_231526__514 | 3 | 0.000437 | 4.27794 | 2 | [58, 272] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231225_231526__514.json | 87.5 | missing | missing | missing | |
| 12059 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 6 | 20231227_210826__903 | 2 | 0.000458 | 4.59774 | 2 | [58, 286] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_210826__903.json | 83.3333 | missing | missing | missing | |
| 12060 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | InJulia | 1SHOT | true | true | 6 | 20231227_210831__578 | 3 | 0.0004745 | 4.75648 | 2 | [58, 297] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231227_210831__578.json | 87.5 | missing | missing | missing | |
| 12061 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo--optim | InJulia | 1SHOT | true | true | 6 | 20231215_202411__861 | 2 | 0.0 | 6.56134 | 2 | [58, 322] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__InJulia__1SHOT__20231215_202411__861.json | 83.3333 | 0.5 | missing | 0.5 | |
| 12062 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231213_210230__559 | 0 | 0.0004395 | 5.87911 | 1 | [93, 262] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231213_210230__559.json | 62.5 | missing | missing | missing | |
| 12063 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_231515__772 | 2 | 0.000372 | 3.02088 | 2 | [93, 217] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_231515__772.json | 83.3333 | missing | missing | missing | |
| 12064 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_231518__993 | 3 | 0.00027 | 2.48965 | 2 | [93, 149] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231225_231518__993.json | 87.5 | missing | missing | missing | |
| 12065 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_210817__882 | 2 | 0.000288 | 3.01755 | 2 | [93, 161] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_210817__882.json | 83.3333 | missing | missing | missing | |
| 12066 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_210821__207 | 5 | 0.000396 | 4.14596 | 2 | [93, 233] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231227_210821__207.json | 95.8333 | missing | missing | missing | |
| 12067 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo--optim | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231215_202404__264 | 4 | 0.0 | 5.95609 | 2 | [93, 208] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.5,\n "top_p": 0.5\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertAsk__1SHOT__20231215_202404__264.json | 91.6667 | 0.5 | missing | 0.5 | |
| 12068 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231213_210224__262 | 0 | 0.0003375 | 4.93106 | 0 | [162, 171] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231213_210224__262.json | 0.0 | missing | missing | missing | |
| 12069 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_231510__696 | 0 | 0.000255 | 1.93263 | 0 | [162, 116] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231510__696.json | 0.0 | missing | missing | missing | |
| 12070 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_231512__489 | 0 | 0.0002775 | 1.96316 | 0 | [162, 131] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231512__489.json | 0.0 | missing | missing | missing | |
| 12071 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_210807__234 | 4 | 0.0004365 | 4.21913 | 2 | [162, 237] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_210807__234.json | 91.6667 | missing | missing | missing | |
| 12072 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_210813__445 | 3 | 0.000615 | 6.22863 | 2 | [162, 356] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231227_210813__445.json | 87.5 | missing | missing | missing | |
| 12073 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo--optim | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231215_202358__641 | 2 | 0.0 | 5.76228 | 2 | [162, 283] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaExpertCoTTask__1SHOT__20231215_202358__641.json | 83.3333 | 0.5 | missing | 0.5 |
| 12074 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231213_210300__512 | 2 | 0.000691 | 8.27198 | 2 | [317, 355] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231213_210300__512.json | 83.3333 | missing | missing | missing | |
| 12075 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_231543__839 | 2 | 0.000628 | 4.40117 | 2 | [317, 313] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231543__839.json | 83.3333 | missing | missing | missing | |
| 12076 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_231548__270 | 2 | 0.000706 | 4.79246 | 2 | [317, 365] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231548__270.json | 83.3333 | missing | missing | missing | |
| 12077 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_210847__960 | 0 | 0.000385 | 2.49689 | 0 | [317, 151] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_210847__960.json | 0.0 | missing | missing | missing | |
| 12078 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_210852__760 | 2 | 0.000718 | 5.72491 | 2 | [317, 373] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231227_210852__760.json | 83.3333 | missing | missing | missing | |
| 12079 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo--optim | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231215_202437__198 | 2 | 0.0 | 7.96926 | 2 | [317, 349] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapCoTTask__1SHOT__20231215_202437__198.json | 83.3333 | 0.5 | missing | 0.5 |
| 12080 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 6 | 20231213_210252__350 | 5 | 0.000554 | 6.37124 | 2 | [316, 264] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231213_210252__350.json | 95.8333 | missing | missing | missing | |
| 12081 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_231537__938 | 0 | 0.000467 | 3.13612 | 0 | [316, 206] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_231537__938.json | 0.0 | missing | missing | missing | |
| 12082 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_231538__997 | 0 | 0.000305 | 1.8485 | 0 | [316, 98] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231225_231538__997.json | 0.0 | missing | missing | missing | |
| 12083 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_210835__444 | 2 | 0.000644 | 4.7788 | 2 | [316, 324] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_210835__444.json | 83.3333 | missing | missing | missing | |
| 12084 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_210844__345 | 4 | 0.0009455 | 8.3696 | 2 | [316, 525] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231227_210844__345.json | 91.6667 | missing | missing | missing | |
| 12085 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo--optim | JuliaRecapTask | 1SHOT | true | true | 6 | 20231215_202429__736 | 2 | 0.0 | 10.9834 | 2 | [316, 487] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.5, "top_p": 0.5} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo/evaluation__JuliaRecapTask__1SHOT__20231215_202429__736.json | 83.3333 | 0.5 | missing | 0.5 |
| 12086 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 6 | 20240201_200748__123 | 2 | 0.000395 | 2.00945 | 2 | [58, 244] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200748__123.json | 83.3333 | missing | missing | missing | |
| 12087 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 6 | 20240201_200750__730 | 2 | 0.0003635 | 2.02334 | 2 | [58, 223] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200750__730.json | 83.3333 | missing | missing | missing | |
| 12088 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 6 | 20240201_200752__865 | 0 | 0.00035 | 1.74514 | 1 | [58, 214] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200752__865.json | 62.5 | missing | missing | missing | |
| 12089 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 6 | 20240201_200754__377 | 2 | 0.000377 | 2.07693 | 2 | [58, 232] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200754__377.json | 83.3333 | missing | missing | missing | |
| 12090 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | InJulia | 1SHOT | true | true | 6 | 20240201_200756__722 | 4 | 0.0003455 | 1.71972 | 2 | [58, 211] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__InJulia__1SHOT__20240201_200756__722.json | 91.6667 | missing | missing | missing | |
| 12091 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_200741__611 | 0 | 0.000288 | 1.23596 | 1 | [93, 161] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200741__611.json | 62.5 | missing | missing | missing | |
| 12092 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_200743__346 | 0 | 0.0002715 | 1.36623 | 1 | [93, 150] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200743__346.json | 62.5 | missing | missing | missing | |
| 12093 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_200744__659 | 0 | 0.000243 | 1.31918 | 1 | [93, 131] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200744__659.json | 62.5 | missing | missing | missing | |
| 12094 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_200745__740 | 0 | 0.000255 | 1.10023 | 1 | [93, 139] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200745__740.json | 62.5 | missing | missing | missing | |
| 12095 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_200746__219 | 0 | 0.0002775 | 1.16611 | 1 | [93, 154] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertAsk__1SHOT__20240201_200746__219.json | 62.5 | missing | missing | missing | |
| 12096 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_200733__115 | 3 | 0.0004335 | 1.67193 | 2 | [162, 235] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200733__115.json | 87.5 | missing | missing | missing | |
| 12097 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_200735__388 | 4 | 0.0004785 | 1.84667 | 2 | [162, 265] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200735__388.json | 91.6667 | missing | missing | missing | |
| 12098 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_200737__849 | 2 | 0.0005085 | 2.19596 | 2 | [162, 285] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200737__849.json | 83.3333 | missing | missing | missing | |
| 12099 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_200739__868 | 0 | 0.0004335 | 1.67928 | 1 | [162, 235] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200739__868.json | 62.5 | missing | missing | missing | |
| 12100 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_200740__614 | 4 | 0.0002835 | 1.31898 | 2 | [162, 135] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaExpertCoTTask__1SHOT__20240201_200740__614.json | 91.6667 | missing | missing | missing | |
| 12101 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_200807__613 | 0 | 0.000283 | 0.795254 | 0 | [317, 83] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200807__613.json | 0.0 | missing | missing | missing | |
| 12102 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_200807__707 | 0 | 0.000283 | 0.973402 | 0 | [317, 83] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200807__707.json | 0.0 | missing | missing | missing | |
| 12103 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_200808__668 | 0 | 0.000199 | 0.470104 | 0 | [317, 27] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200808__668.json | 0.0 | missing | missing | missing | |
| 12104 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_200809__556 | 0 | 0.00031 | 0.913433 | 0 | [317, 101] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200809__556.json | 0.0 | missing | missing | missing | |
| 12105 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20240201_200810__226 | 0 | 0.000346 | 1.21536 | 0 | [317, 125] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapCoTTask__1SHOT__20240201_200810__226.json | 0.0 | missing | missing | missing | |
| 12106 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_200758__367 | 5 | 0.000521 | 2.23434 | 2 | [316, 242] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200758__367.json | 95.8333 | missing | missing | missing | |
| 12107 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_200801__729 | 2 | 0.0005315 | 2.33754 | 2 | [316, 249] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200801__729.json | 83.3333 | missing | missing | missing | |
| 12108 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_200803__314 | 0 | 0.0003935 | 1.97547 | 1 | [316, 157] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200803__314.json | 62.5 | missing | missing | missing | |
| 12109 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_200805__708 | 0 | 0.000515 | 1.92127 | 1 | [316, 238] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200805__708.json | 62.5 | missing | missing | missing | |
| 12110 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-0125 | JuliaRecapTask | 1SHOT | false | false | 6 | 20240201_200806__889 | 0 | 0.000323 | 0.913192 | 0 | [316, 110] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-0125/evaluation__JuliaRecapTask__1SHOT__20240201_200806__889.json | 0.0 | missing | missing | missing | |
| 12111 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 6 | 20231213_210317__793 | 0 | 0.000541 | 4.76723 | 0 | [55, 243] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231213_210317__793.json | 0.0 | missing | missing | missing | |
| 12112 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | AsIs | 1SHOT | false | false | 6 | 20231225_231602__753 | 0 | 0.000465 | 2.17647 | 0 | [55, 205] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_231602__753.json | 0.0 | missing | missing | missing | |
| 12113 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | AsIs | 1SHOT | true | true | 6 | 20231225_231605__346 | 2 | 0.000503 | 2.67839 | 2 | [55, 224] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231225_231605__346.json | 83.3333 | missing | missing | missing | |
| 12114 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106--optim | AsIs | 1SHOT | false | false | 6 | 20231215_202452__248 | 0 | 0.0 | 3.48661 | 0 | [55, 192] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__AsIs__1SHOT__20231215_202452__248.json | 0.0 | 0.9 | missing | 0.1 |
| 12115 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 6 | 20231213_210312__649 | 3 | 0.000508 | 4.08865 | 2 | [58, 225] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231213_210312__649.json | 87.5 | missing | missing | missing | |
| 12116 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 6 | 20231225_231558__557 | 4 | 0.000504 | 2.57275 | 2 | [58, 223] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_231558__557.json | 91.6667 | missing | missing | missing | |
| 12117 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 6 | 20231225_231600__702 | 2 | 0.000494 | 2.42094 | 2 | [58, 218] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231225_231600__702.json | 83.3333 | missing | missing | missing | |
| 12118 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 6 | 20231227_210904__393 | 2 | 0.000484 | 2.90817 | 2 | [58, 213] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_210904__393.json | 83.3333 | missing | missing | missing | |
| 12119 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | InJulia | 1SHOT | true | true | 6 | 20231227_210907__667 | 2 | 0.000516 | 2.9363 | 2 | [58, 229] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231227_210907__667.json | 83.3333 | missing | missing | missing | |
| 12120 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106--optim | InJulia | 1SHOT | true | true | 6 | 20231215_202448__625 | 2 | 0.0 | 4.09232 | 2 | [58, 194] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__InJulia__1SHOT__20231215_202448__625.json | 83.3333 | 0.9 | missing | 0.1 |
| 12121 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231213_210308__887 | 0 | 0.000387 | 3.86626 | 1 | [93, 147] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231213_210308__887.json | 62.5 | missing | missing | missing | |
| 12122 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_231553__595 | 0 | 0.000343 | 1.67681 | 1 | [93, 125] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_231553__595.json | 62.5 | missing | missing | missing | |
| 12123 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_231555__578 | 4 | 0.000387 | 1.78859 | 2 | [93, 147] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231225_231555__578.json | 91.6667 | missing | missing | missing | |
| 12124 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_210858__635 | 5 | 0.000381 | 2.25596 | 2 | [93, 144] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_210858__635.json | 95.8333 | missing | missing | missing | |
| 12125 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_210901__493 | 4 | 0.000417 | 2.35672 | 2 | [93, 162] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231227_210901__493.json | 91.6667 | missing | missing | missing | |
| 12126 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106--optim | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231215_202444__251 | 2 | 0.0 | 3.86221 | 2 | [93, 148] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertAsk__1SHOT__20231215_202444__251.json | 83.3333 | 0.9 | missing | 0.1 |
| 12127 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231213_210304__703 | 2 | 0.000398 | 3.74802 | 2 | [162, 118] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231213_210304__703.json | 83.3333 | missing | missing | missing | |
| 12128 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_231549__696 | 2 | 0.00041 | 1.59658 | 2 | [162, 124] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231549__696.json | 83.3333 | missing | missing | missing | |
| 12129 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_231551__225 | 2 | 0.00038 | 1.42341 | 2 | [162, 109] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231551__225.json | 83.3333 | missing | missing | missing | |
| 12130 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_210854__415 | 2 | 0.000394 | 1.91887 | 2 | [162, 116] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_210854__415.json | 83.3333 | missing | missing | missing | |
| 12131 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_210856__994 | 2 | 0.000404 | 1.65649 | 2 | [162, 121] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231227_210856__994.json | 83.3333 | missing | missing | missing | |
| 12132 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106--optim | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231215_202440__506 | 2 | 0.0 | 3.05422 | 2 | [162, 116] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.1} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaExpertCoTTask__1SHOT__20231215_202440__506.json | 83.3333 | 0.9 | missing | 0.1 |
| 12133 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231213_210322__171 | 0 | 0.000689 | 3.62672 | 0 | [317, 186] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231213_210322__171.json | 0.0 | missing | missing | missing | |
| 12134 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_231611__659 | 0 | 0.000533 | 1.60105 | 0 | [317, 108] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231611__659.json | 0.0 | missing | missing | missing | |
| 12135 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_231613__464 | 0 | 0.000501 | 1.2005 | 0 | [317, 92] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231613__464.json | 0.0 | missing | missing | missing | |
| 12136 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_210914__818 | 0 | 0.000539 | 1.62802 | 0 | [317, 111] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_210914__818.json | 0.0 | missing | missing | missing | |
| 12137 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_210915__918 | 0 | 0.000519 | 1.59895 | 0 | [317, 101] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231227_210915__918.json | 0.0 | missing | missing | missing | |
| 12138 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106--optim | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231215_202456__611 | 0 | 0.0 | 1.97027 | 0 | [317, 105] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapCoTTask__1SHOT__20231215_202456__611.json | 0.0 | 0.9 | missing | 0.1 | |
| 12139 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231213_210319__721 | 0 | 0.000466 | 1.36872 | 0 | [316, 75] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231213_210319__721.json | 0.0 | missing | missing | missing | |
| 12140 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_231607__712 | 0 | 0.000574 | 1.79995 | 0 | [316, 129] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_231607__712.json | 0.0 | missing | missing | missing | |
| 12141 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_231610__926 | 0 | 0.00078 | 2.763 | 0 | [316, 232] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231225_231610__926.json | 50.0 | missing | missing | missing | |
| 12142 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_210911__475 | 2 | 0.000736 | 3.38914 | 2 | [316, 210] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_210911__475.json | 83.3333 | missing | missing | missing | |
| 12143 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_210912__468 | 0 | 0.000544 | 1.60395 | 0 | [316, 114] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231227_210912__468.json | 0.0 | missing | missing | missing | |
| 12144 | Apple-MacBook-Pro-M1 | wrap_string | gpt-3.5-turbo-1106--optim | JuliaRecapTask | 1SHOT | false | false | 6 | 20231215_202454__869 | 0 | 0.0 | 2.16077 | 0 | [316, 139] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.1\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-3.5-turbo-1106/evaluation__JuliaRecapTask__1SHOT__20231215_202454__869.json | 0.0 | 0.9 | missing | 0.1 | |
| 12145 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 6 | 20240201_120744__426 | 0 | 0.01387 | 27.7605 | 1 | [58, 443] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_120744__426.json | 62.5 | missing | missing | missing | |
| 12146 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 6 | 20240201_120817__748 | 6 | 0.01531 | 33.1368 | 2 | [58, 491] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_120817__748.json | 100.0 | missing | missing | missing | |
| 12147 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 6 | 20240201_120854__853 | 6 | 0.01363 | 36.5101 | 2 | [58, 435] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_120854__853.json | 100.0 | missing | missing | missing | |
| 12148 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 6 | 20240201_120930__835 | 6 | 0.01585 | 35.9719 | 2 | [58, 509] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_120930__835.json | 100.0 | missing | missing | missing | |
| 12149 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | InJulia | 1SHOT | true | true | 6 | 20240201_121008__205 | 5 | 0.01543 | 38.1539 | 2 | [58, 495] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__InJulia__1SHOT__20240201_121008__205.json | 95.8333 | missing | missing | missing | |
| 12150 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_120326__130 | 6 | 0.00774 | 17.0065 | 2 | [93, 227] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_120326__130.json | 100.0 | missing | missing | missing | |
| 12151 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_120344__363 | 0 | 0.00813 | 18.008 | 1 | [93, 240] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_120344__363.json | 62.5 | missing | missing | missing | |
| 12152 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_120358__269 | 5 | 0.00723 | 13.747 | 2 | [93, 210] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_120358__269.json | 95.8333 | missing | missing | missing | |
| 12153 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_120411__265 | 6 | 0.006 | 12.6517 | 2 | [93, 169] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_120411__265.json | 100.0 | missing | missing | missing | |
| 12154 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20240201_120430__718 | 5 | 0.00801 | 18.9569 | 2 | [93, 236] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertAsk__1SHOT__20240201_120430__718.json | 95.8333 | missing | missing | missing | |
| 12155 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_115947__241 | 6 | 0.01629 | 34.0513 | 2 | [162, 489] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_115947__241.json | 100.0 | missing | missing | missing | |
| 12156 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_120005__691 | 5 | 0.00945 | 18.8702 | 2 | [162, 261] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_120005__691.json | 95.8333 | missing | missing | missing | |
| 12157 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_120037__429 | 5 | 0.01278 | 31.6794 | 2 | [162, 372] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_120037__429.json | 95.8333 | missing | missing | missing | |
| 12158 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_120119__184 | 5 | 0.01302 | 42.1121 | 2 | [162, 380] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_120119__184.json | 95.8333 | missing | missing | missing | |
| 12159 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20240201_120150__530 | 6 | 0.01407 | 30.3799 | 2 | [162, 415] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaExpertCoTTask__1SHOT__20240201_120150__530.json | 100.0 | missing | missing | missing | |
| 12160 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_122008__952 | 4 | 0.01529 | 28.1101 | 2 | [317, 404] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_122008__952.json | 91.6667 | missing | missing | missing | |
| 12161 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_122037__253 | 5 | 0.01763 | 29.6912 | 2 | [317, 482] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_122037__253.json | 95.8333 | missing | missing | missing | |
| 12162 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_122150__255 | 6 | 0.01904 | 72.1651 | 2 | [317, 529] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_122150__255.json | 100.0 | missing | missing | missing | |
| 12163 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_122229__702 | 6 | 0.02132 | 38.8716 | 2 | [317, 605] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_122229__702.json | 100.0 | missing | missing | missing | |
| 12164 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20240201_122249__799 | 6 | 0.01289 | 20.5216 | 2 | [317, 324] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapCoTTask__1SHOT__20240201_122249__799.json | 100.0 | missing | missing | missing | |
| 12165 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_121355__248 | 5 | 0.01741 | 32.3437 | 2 | [316, 475] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_121355__248.json | 95.8333 | missing | missing | missing | |
| 12166 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_121437__635 | 5 | 0.01996 | 41.8433 | 2 | [316, 560] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_121437__635.json | 95.8333 | missing | missing | missing | |
| 12167 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_121522__472 | 6 | 0.01705 | 44.9177 | 2 | [316, 463] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_121522__472.json | 100.0 | missing | missing | missing | |
| 12168 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_121541__734 | 0 | 0.01573 | 18.9319 | 1 | [316, 419] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_121541__734.json | 62.5 | missing | missing | missing | |
| 12169 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-0125-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20240201_121618__559 | 6 | 0.01882 | 37.2478 | 2 | [316, 522] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-0125-preview/evaluation__JuliaRecapTask__1SHOT__20240201_121618__559.json | 100.0 | missing | missing | missing | |
| 12170 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 6 | 20231213_210528__506 | 0 | 0.01156 | 32.1831 | 0 | [55, 367] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231213_210528__506.json | 0.0 | missing | missing | missing | |
| 12171 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 6 | 20231225_231753__663 | 0 | 0.0133 | 15.1661 | 0 | [55, 425] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_231753__663.json | 0.0 | missing | missing | missing | |
| 12172 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | AsIs | 1SHOT | false | false | 6 | 20231225_231810__502 | 0 | 0.01351 | 16.6119 | 0 | [55, 432] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231225_231810__502.json | 0.0 | missing | missing | missing | |
| 12173 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview--optim | AsIs | 1SHOT | false | false | 6 | 20231215_202714__226 | 0 | 0.0 | 34.4046 | 0 | [55, 393] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__AsIs__1SHOT__20231215_202714__226.json | 0.0 | 0.1 | missing | 0.9 | |
| 12174 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 6 | 20231213_210455__754 | 6 | 0.0127 | 27.4819 | 2 | [58, 404] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231213_210455__754.json | 100.0 | missing | missing | missing | |
| 12175 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 6 | 20231225_231718__586 | 5 | 0.01273 | 13.6435 | 2 | [58, 405] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_231718__586.json | 95.8333 | missing | missing | missing | |
| 12176 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 6 | 20231225_231738__772 | 6 | 0.01594 | 20.1228 | 2 | [58, 512] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231225_231738__772.json | 100.0 | missing | missing | missing | |
| 12177 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 6 | 20231227_211106__897 | 6 | 0.01555 | 29.8313 | 2 | [58, 499] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_211106__897.json | 100.0 | missing | missing | missing | |
| 12178 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | InJulia | 1SHOT | true | true | 6 | 20231227_211149__956 | 6 | 0.01357 | 42.1856 | 2 | [58, 433] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231227_211149__956.json | 100.0 | missing | missing | missing | |
| 12179 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview--optim | InJulia | 1SHOT | true | true | 6 | 20231215_202640__494 | 5 | 0.0 | 48.7396 | 2 | [58, 380] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__InJulia__1SHOT__20231215_202640__494.json | 95.8333 | 0.1 | missing | 0.9 | |
| 12180 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231213_210428__920 | 6 | 0.00699 | 14.7793 | 2 | [93, 202] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231213_210428__920.json | 100.0 | missing | missing | missing | |
| 12181 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_231655__494 | 6 | 0.00879 | 9.87311 | 2 | [93, 262] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_231655__494.json | 100.0 | missing | missing | missing | |
| 12182 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_231704__732 | 5 | 0.00867 | 8.68428 | 2 | [93, 258] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231225_231704__732.json | 95.8333 | missing | missing | missing | |
| 12183 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_211025__282 | 5 | 0.00705 | 14.702 | 2 | [93, 204] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_211025__282.json | 95.8333 | missing | missing | missing | |
| 12184 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_211036__573 | 4 | 0.00684 | 11.4311 | 2 | [93, 197] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231227_211036__573.json | 91.6667 | missing | missing | missing | |
| 12185 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview--optim | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231215_202551__695 | 6 | 0.0 | 23.1322 | 2 | [93, 178] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertAsk__1SHOT__20231215_202551__695.json | 100.0 | 0.1 | missing | 0.9 | |
| 12186 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231213_210413__728 | 5 | 0.01368 | 50.6233 | 2 | [162, 402] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231213_210413__728.json | 95.8333 | missing | missing | missing | |
| 12187 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_231627__461 | 6 | 0.01545 | 13.9878 | 2 | [162, 461] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231627__461.json | 100.0 | missing | missing | missing | |
| 12188 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_231645__187 | 6 | 0.01167 | 18.0291 | 2 | [162, 335] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231645__187.json | 100.0 | missing | missing | missing | |
| 12189 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_210937__605 | 5 | 0.01338 | 21.4452 | 2 | [162, 392] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_210937__605.json | 95.8333 | missing | missing | missing | |
| 12190 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_211010__470 | 5 | 0.01455 | 32.8482 | 2 | [162, 431] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231227_211010__470.json | 95.8333 | missing | missing | missing | |
| 12191 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview--optim | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231215_202528__200 | 2 | 0.0 | 31.3468 | 2 | [162, 340] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaExpertCoTTask__1SHOT__20231215_202528__200.json | 83.3333 | 0.1 | missing | 0.9 | |
| 12192 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231213_210638__852 | 5 | 0.01241 | 22.034 | 2 | [317, 308] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231213_210638__852.json | 95.8333 | missing | missing | missing | |
| 12193 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_231930__522 | 6 | 0.01361 | 16.2463 | 2 | [317, 348] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231930__522.json | 100.0 | missing | missing | missing | |
| 12194 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_231944__773 | 5 | 0.01538 | 13.1911 | 2 | [317, 407] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231225_231944__773.json | 95.8333 | missing | missing | missing | |
| 12195 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_211255__699 | 4 | 0.01412 | 32.1304 | 2 | [317, 365] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_211255__699.json | 91.6667 | missing | missing | missing | |
| 12196 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_211325__908 | 6 | 0.0149 | 29.8593 | 2 | [317, 391] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231227_211325__908.json | 100.0 | missing | missing | missing | |
| 12197 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview--optim | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231215_202811__337 | 5 | 0.0 | 26.529 | 2 | [317, 331] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapCoTTask__1SHOT__20231215_202811__337.json | 95.8333 | 0.1 | missing | 0.9 | |
| 12198 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20231213_210616__266 | 6 | 0.01585 | 48.2713 | 2 | [316, 423] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231213_210616__266.json | 100.0 | missing | missing | missing | |
| 12199 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_231853__365 | 6 | 0.0244 | 43.0268 | 2 | [316, 708] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_231853__365.json | 100.0 | missing | missing | missing | |
| 12200 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_231914__500 | 6 | 0.0184 | 20.7224 | 2 | [316, 508] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231225_231914__500.json | 100.0 | missing | missing | missing | |
| 12201 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_211207__335 | 6 | 0.01075 | 18.2847 | 2 | [316, 253] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_211207__335.json | 100.0 | missing | missing | missing | |
| 12202 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_211223__513 | 5 | 0.01066 | 15.9072 | 2 | [316, 250] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231227_211223__513.json | 95.8333 | missing | missing | missing | |
| 12203 | Apple-MacBook-Pro-M1 | wrap_string | gpt-4-1106-preview--optim | JuliaRecapTask | 1SHOT | true | true | 6 | 20231215_202744__590 | 4 | 0.0 | 30.096 | 2 | [316, 400] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.1,\n "top_p": 0.9\n} | PromptingTools.OpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/gpt-4-1106-preview/evaluation__JuliaRecapTask__1SHOT__20231215_202744__590.json | 91.6667 | 0.1 | missing | 0.9 | |
| 12204 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | AsIs | 1SHOT | false | false | 6 | 20231214_090018__870 | 0 | 0.0 | 15.3691 | 0 | [48, 461] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__AsIs__1SHOT__20231214_090018__870.json | 0.0 | missing | missing | missing | |
| 12205 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | AsIs | 1SHOT | false | false | 6 | 20231225_071244__709 | 0 | 0.0 | 11.0051 | 0 | [48, 331] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__AsIs__1SHOT__20231225_071244__709.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12206 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | AsIs | 1SHOT | false | false | 6 | 20231225_071256__825 | 0 | 0.0 | 11.7151 | 0 | [1, 363] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__AsIs__1SHOT__20231225_071256__825.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12207 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | InJulia | 1SHOT | true | false | 6 | 20231214_090003__841 | 0 | 0.0 | 15.5205 | 0 | [65, 461] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__InJulia__1SHOT__20231214_090003__841.json | 25.0 | missing | missing | missing | |
| 12208 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | InJulia | 1SHOT | true | false | 6 | 20231225_071219__523 | 0 | 0.0 | 13.329 | 0 | [65, 394] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__InJulia__1SHOT__20231225_071219__523.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12209 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | InJulia | 1SHOT | false | false | 6 | 20231225_071233__485 | 0 | 0.0 | 13.7704 | 0 | [1, 422] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__InJulia__1SHOT__20231225_071233__485.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12210 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | InJulia | 1SHOT | true | true | 6 | 20231227_092002__780 | 0 | 0.0 | 12.5024 | 0 | [65, 380] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__InJulia__1SHOT__20231227_092002__780.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12211 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231214_085947__426 | 0 | 0.0 | 8.25063 | 2 | [94, 241] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_085947__426.json | 75.0 | missing | missing | missing | |
| 12212 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_071158__800 | 0 | 0.0 | 9.65799 | 0 | [94, 280] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_071158__800.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12213 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_071206__292 | 0 | 0.0 | 7.50202 | 0 | [1, 234] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_071206__292.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12214 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_091950__997 | 0 | 0.0 | 12.9728 | 0 | [94, 386] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_091950__997.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12215 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231214_085939__134 | 0 | 0.0 | 15.0628 | 0 | [175, 408] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085939__134.json | 0.0 | missing | missing | missing | |
| 12216 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_071127__609 | 0 | 0.0 | 24.4218 | 0 | [193, 420] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071127__609.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12217 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_071148__175 | 0 | 0.0 | 19.1673 | 0 | [1, 548] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071148__175.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12218 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_091936__904 | 0 | 0.0 | 22.599 | 0 | [193, 441] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091936__904.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12219 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231214_090108__418 | 0 | 0.0 | 27.0922 | 0 | [11, 718] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_090108__418.json | 50.0 | missing | missing | missing | |
| 12220 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_071354__776 | 0 | 0.0 | 20.4252 | 0 | [11, 549] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071354__776.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12221 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_071414__243 | 0 | 0.0 | 19.9276 | 0 | [1, 542] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071414__243.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12222 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_092032__994 | 0 | 0.0 | 2.41351 | 0 | [11, 66] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092032__994.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12223 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231214_090041__682 | 0 | 0.0 | 23.256 | 0 | [365, 548] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_090041__682.json | 0.0 | missing | missing | missing | |
| 12224 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_071311__274 | 0 | 0.0 | 15.3346 | 0 | [365, 337] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_071311__274.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12225 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_071334__658 | 0 | 0.0 | 22.4462 | 0 | [1, 605] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_071334__658.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12226 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_092030__850 | 2 | 0.0 | 27.6493 | 2 | [365, 666] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_092030__850.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12227 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | AsIs | 1SHOT | false | false | 6 | 20231214_090958__356 | 0 | 0.0 | 16.9408 | 0 | [48, 505] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__AsIs__1SHOT__20231214_090958__356.json | 0.0 | missing | missing | missing | |
| 12228 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | AsIs | 1SHOT | false | false | 6 | 20231225_073608__903 | 0 | 0.0 | 5.16632 | 0 | [62, 164] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__AsIs__1SHOT__20231225_073608__903.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12229 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | AsIs | 1SHOT | false | false | 6 | 20231225_073614__495 | 0 | 0.0 | 6.11921 | 0 | [62, 197] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__AsIs__1SHOT__20231225_073614__495.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12230 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | InJulia | 1SHOT | true | true | 6 | 20231214_090941__541 | 0 | 0.0 | 17.3566 | 2 | [65, 512] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__InJulia__1SHOT__20231214_090941__541.json | 75.0 | missing | missing | missing | |
| 12231 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | InJulia | 1SHOT | true | true | 6 | 20231225_073555__338 | 0 | 0.0 | 9.56099 | 1 | [65, 304] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__InJulia__1SHOT__20231225_073555__338.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12232 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | InJulia | 1SHOT | true | true | 6 | 20231225_073603__926 | 0 | 0.0 | 7.14085 | 1 | [65, 225] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__InJulia__1SHOT__20231225_073603__926.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12233 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | InJulia | 1SHOT | true | true | 6 | 20231227_092826__312 | 3 | 0.0 | 12.358 | 2 | [65, 406] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__InJulia__1SHOT__20231227_092826__312.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12234 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231214_090923__979 | 0 | 0.0 | 9.88133 | 0 | [94, 290] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_090923__979.json | 0.0 | missing | missing | missing | |
| 12235 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073535__606 | 0 | 0.0 | 7.82167 | 1 | [104, 243] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_073535__606.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12236 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073545__999 | 0 | 0.0 | 10.0075 | 0 | [104, 315] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_073545__999.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12237 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_092814__986 | 0 | 0.0 | 9.81063 | 0 | [104, 315] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_092814__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12238 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231214_090913__635 | 0 | 0.0 | 20.4157 | 0 | [175, 555] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_090913__635.json | 0.0 | missing | missing | missing | |
| 12239 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_073518__766 | 0 | 0.0 | 16.0027 | 1 | [185, 289] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073518__766.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12240 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_073527__936 | 1 | 0.0 | 9.475 | 2 | [185, 283] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073527__936.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12241 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_092804__440 | 6 | 0.0 | 15.4852 | 2 | [185, 299] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092804__440.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12242 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231214_091039__113 | 0 | 0.0 | 20.7042 | 0 | [11, 560] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_091039__113.json | 0.0 | missing | missing | missing | |
| 12243 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_073650__984 | 0 | 0.0 | 13.2226 | 1 | [368, 368] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073650__984.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12244 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_073701__632 | 4 | 0.0 | 10.8874 | 2 | [368, 299] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073701__632.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12245 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_092850__886 | 4 | 0.0 | 10.8554 | 2 | [368, 301] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092850__886.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12246 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapTask | 1SHOT | false | false | 6 | 20231214_091018__845 | 0 | 0.0 | 20.3092 | 0 | [365, 472] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_091018__845.json | 0.0 | missing | missing | missing | |
| 12247 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_073627__132 | 0 | 0.0 | 13.0086 | 1 | [365, 362] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_073627__132.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12248 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_073636__922 | 3 | 0.0 | 8.96252 | 2 | [365, 236] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_073636__922.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12249 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_092839__652 | 0 | 0.0 | 12.9749 | 0 | [365, 368] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_092839__652.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12250 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_183434__412 | 0 | 0.0 | 16.037 | 1 | [65, 311] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183434__412.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12251 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_183454__858 | 0 | 0.0 | 20.4749 | 1 | [65, 398] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183454__858.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12252 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_183511__140 | 4 | 0.0 | 16.1547 | 2 | [65, 313] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183511__140.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12253 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_183349__811 | 0 | 0.0 | 11.1204 | 0 | [104, 209] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183349__811.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12254 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_183403__115 | 6 | 0.0 | 14.0422 | 2 | [104, 267] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183403__115.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12255 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_183418__393 | 6 | 0.0 | 14.6978 | 2 | [104, 280] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183418__393.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12256 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_183310__968 | 3 | 0.0 | 13.9598 | 2 | [185, 257] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_183310__968.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12257 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_183321__211 | 3 | 0.0 | 10.7731 | 2 | [185, 194] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_183321__211.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12258 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_183337__327 | 3 | 0.0 | 16.4286 | 2 | [185, 305] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_183337__327.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12259 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_183624__576 | 3 | 0.0 | 21.6535 | 2 | [368, 380] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183624__576.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12260 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_183640__243 | 4 | 0.0 | 16.8115 | 2 | [368, 288] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183640__243.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12261 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_183659__967 | 4 | 0.0 | 18.2868 | 2 | [368, 316] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183659__967.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12262 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_183532__404 | 2 | 0.0 | 21.0304 | 2 | [365, 368] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183532__404.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12263 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_183551__213 | 4 | 0.0 | 18.7041 | 2 | [365, 324] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183551__213.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12264 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_183602__995 | 0 | 0.0 | 11.0379 | 2 | [365, 177] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183602__995.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12265 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | AsIs | 1SHOT | false | false | 6 | 20231213_210950__515 | 0 | 0.00283979 | 28.9715 | 0 | [60, 331] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__AsIs__1SHOT__20231213_210950__515.json | 0.0 | missing | missing | missing | |
| 12266 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | AsIs | 1SHOT | false | false | 6 | 20231225_232250__857 | 0 | 0.0029935 | 7.7596 | 0 | [60, 350] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__AsIs__1SHOT__20231225_232250__857.json | 0.0 | missing | missing | missing | |
| 12267 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | AsIs | 1SHOT | false | false | 6 | 20231225_232259__903 | 0 | 0.00368924 | 9.61509 | 0 | [60, 436] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__AsIs__1SHOT__20231225_232259__903.json | 0.0 | missing | missing | missing | |
| 12268 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium--optim | AsIs | 1SHOT | false | false | 6 | 20231215_202937__877 | 0 | 0.0 | 8.32359 | 0 | [60, 378] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__AsIs__1SHOT__20231215_202937__877.json | 0.0 | 0.9 | missing | 0.3 | |
| 12269 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | InJulia | 1SHOT | true | true | 6 | 20231213_210921__567 | 3 | 0.00339801 | 34.6886 | 2 | [63, 399] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__InJulia__1SHOT__20231213_210921__567.json | 87.5 | missing | missing | missing | |
| 12270 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | InJulia | 1SHOT | true | true | 6 | 20231225_232228__307 | 3 | 0.00292879 | 7.59254 | 2 | [63, 341] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__InJulia__1SHOT__20231225_232228__307.json | 87.5 | missing | missing | missing | |
| 12271 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | InJulia | 1SHOT | true | true | 6 | 20231225_232242__160 | 5 | 0.00351127 | 13.8511 | 2 | [63, 413] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__InJulia__1SHOT__20231225_232242__160.json | 95.8333 | missing | missing | missing | |
| 12272 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | InJulia | 1SHOT | true | true | 6 | 20231227_211638__776 | 5 | 0.00506455 | 22.6326 | 2 | [63, 605] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__InJulia__1SHOT__20231227_211638__776.json | 95.8333 | missing | missing | missing | |
| 12273 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | InJulia | 1SHOT | true | true | 6 | 20231227_211656__807 | 3 | 0.0034061 | 17.6023 | 2 | [63, 400] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__InJulia__1SHOT__20231227_211656__807.json | 87.5 | missing | missing | missing | |
| 12274 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium--optim | InJulia | 1SHOT | true | true | 6 | 20231215_202928__185 | 3 | 0.0 | 6.61131 | 2 | [63, 302] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__InJulia__1SHOT__20231215_202928__185.json | 87.5 | 0.9 | missing | 0.3 | |
| 12275 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231213_210846__640 | 6 | 0.00331724 | 34.3947 | 2 | [102, 376] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231213_210846__640.json | 100.0 | missing | missing | missing | |
| 12276 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_232210__932 | 4 | 0.00233835 | 5.68349 | 2 | [102, 255] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_232210__932.json | 91.6667 | missing | missing | missing | |
| 12277 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_232220__551 | 5 | 0.00373792 | 9.52121 | 2 | [102, 428] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231225_232220__551.json | 95.8333 | missing | missing | missing | |
| 12278 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_211546__979 | 0 | 0.00279139 | 20.6849 | 0 | [102, 311] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_211546__979.json | 25.0 | missing | missing | missing | |
| 12279 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_211615__435 | 3 | 0.00303409 | 28.944 | 2 | [102, 341] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231227_211615__435.json | 87.5 | missing | missing | missing | |
| 12280 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium--optim | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231215_202922__393 | 3 | 0.0 | 7.47058 | 2 | [102, 337] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertAsk__1SHOT__20231215_202922__393.json | 87.5 | 0.9 | missing | 0.3 | |
| 12281 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231213_210811__868 | 0 | 0.00433685 | 12.5083 | 1 | [183, 475] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231213_210811__868.json | 62.5 | missing | missing | missing | |
| 12282 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_232155__853 | 0 | 0.00504877 | 12.5667 | 1 | [183, 563] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_232155__853.json | 62.5 | missing | missing | missing | |
| 12283 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_232204__171 | 3 | 0.00368965 | 8.85971 | 2 | [183, 395] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231225_232204__171.json | 87.5 | missing | missing | missing | |
| 12284 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_211506__487 | 4 | 0.00377055 | 9.1233 | 2 | [183, 405] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_211506__487.json | 91.6667 | missing | missing | missing | |
| 12285 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_211525__381 | 4 | 0.00457146 | 18.8342 | 2 | [183, 504] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231227_211525__381.json | 91.6667 | missing | missing | missing | |
| 12286 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium--optim | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231215_202914__803 | 4 | 0.0 | 10.3399 | 2 | [183, 465] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaExpertCoTTask__1SHOT__20231215_202914__803.json | 91.6667 | 0.9 | missing | 0.3 | |
| 12287 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231213_211102__805 | 0 | 0.00458555 | 42.9241 | 1 | [365, 445] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231213_211102__805.json | 62.5 | missing | missing | missing | |
| 12288 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_232329__464 | 3 | 0.00455319 | 10.4869 | 2 | [365, 441] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_232329__464.json | 87.5 | missing | missing | missing | |
| 12289 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_232357__229 | 0 | 0.00388981 | 27.661 | 1 | [365, 359] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231225_232357__229.json | 62.5 | missing | missing | missing | |
| 12290 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_211746__454 | 3 | 0.00440757 | 21.2618 | 2 | [365, 423] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_211746__454.json | 87.5 | missing | missing | missing | |
| 12291 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_211801__763 | 5 | 0.00345295 | 15.0491 | 2 | [365, 305] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231227_211801__763.json | 95.8333 | missing | missing | missing | |
| 12292 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium--optim | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231215_202958__524 | 6 | 0.0 | 10.7249 | 2 | [365, 400] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapCoTTask__1SHOT__20231215_202958__524.json | 100.0 | 0.9 | missing | 0.3 | |
| 12293 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 6 | 20231213_211019__625 | 1 | 0.00357429 | 29.2486 | 2 | [362, 321] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231213_211019__625.json | 79.1667 | missing | missing | missing | |
| 12294 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_232309__173 | 5 | 0.00426194 | 9.1688 | 2 | [362, 406] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_232309__173.json | 95.8333 | missing | missing | missing | |
| 12295 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_232318__614 | 5 | 0.00440756 | 9.62477 | 2 | [362, 424] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231225_232318__614.json | 95.8333 | missing | missing | missing | |
| 12296 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_211707__169 | 6 | 0.00482015 | 10.9042 | 2 | [362, 475] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_211707__169.json | 100.0 | missing | missing | missing | |
| 12297 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_211724__111 | 6 | 0.00487678 | 17.4868 | 2 | [362, 482] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231227_211724__111.json | 100.0 | missing | missing | missing | |
| 12298 | Apple-MacBook-Pro-M1 | wrap_string | mistral-medium--optim | JuliaRecapTask | 1SHOT | true | true | 6 | 20231215_202947__695 | 3 | 0.0 | 10.018 | 2 | [362, 445] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-medium/evaluation__JuliaRecapTask__1SHOT__20231215_202947__695.json | 87.5 | 0.9 | missing | 0.3 | |
| 12299 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | AsIs | 1SHOT | true | true | 6 | 20231213_210746__603 | 3 | 0.000796713 | 5.38724 | 2 | [59, 391] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__AsIs__1SHOT__20231213_210746__603.json | 87.5 | missing | missing | missing | |
| 12300 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | AsIs | 1SHOT | true | true | 6 | 20231225_232103__490 | 3 | 0.000771493 | 5.10288 | 2 | [59, 378] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__AsIs__1SHOT__20231225_232103__490.json | 87.5 | missing | missing | missing | |
| 12301 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | AsIs | 1SHOT | true | true | 6 | 20231225_232120__488 | 3 | 0.000779253 | 17.3112 | 2 | [59, 382] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__AsIs__1SHOT__20231225_232120__488.json | 87.5 | missing | missing | missing | |
| 12302 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small--optim | AsIs | 1SHOT | true | true | 6 | 20231215_202852__526 | 1 | 0.0 | 6.6285 | 2 | [59, 500] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__AsIs__1SHOT__20231215_202852__526.json | 79.1667 | 0.9 | missing | 0.3 | |
| 12303 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | InJulia | 1SHOT | true | true | 6 | 20231213_210741__225 | 3 | 0.000755974 | 5.04142 | 2 | [62, 369] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__InJulia__1SHOT__20231213_210741__225.json | 87.5 | missing | missing | missing | |
| 12304 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | InJulia | 1SHOT | true | true | 6 | 20231225_232052__172 | 1 | 0.000994594 | 6.64021 | 0 | [62, 492] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__InJulia__1SHOT__20231225_232052__172.json | 54.1667 | missing | missing | missing | |
| 12305 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | InJulia | 1SHOT | true | true | 6 | 20231225_232057__930 | 2 | 0.000839394 | 5.64589 | 2 | [62, 412] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__InJulia__1SHOT__20231225_232057__930.json | 83.3333 | missing | missing | missing | |
| 12306 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | InJulia | 1SHOT | true | true | 6 | 20231227_211433__143 | 5 | 0.000691954 | 4.46976 | 2 | [62, 336] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__InJulia__1SHOT__20231227_211433__143.json | 95.8333 | missing | missing | missing | |
| 12307 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | InJulia | 1SHOT | true | true | 6 | 20231227_211438__620 | 3 | 0.000781194 | 5.13684 | 2 | [62, 382] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__InJulia__1SHOT__20231227_211438__620.json | 87.5 | missing | missing | missing | |
| 12308 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small--optim | InJulia | 1SHOT | true | true | 6 | 20231215_202845__719 | 3 | 0.0 | 4.88576 | 2 | [62, 373] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__InJulia__1SHOT__20231215_202845__719.json | 87.5 | 0.9 | missing | 0.3 | |
| 12309 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231213_210735__942 | 3 | 0.000654461 | 4.15432 | 2 | [103, 303] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231213_210735__942.json | 87.5 | missing | missing | missing | |
| 12310 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_232041__808 | 0 | 0.000576861 | 3.58469 | 2 | [103, 263] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_232041__808.json | 75.0 | missing | missing | missing | |
| 12311 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_232045__210 | 1 | 0.000592381 | 3.67487 | 2 | [103, 271] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231225_232045__210.json | 79.1667 | missing | missing | missing | |
| 12312 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_211425__644 | 1 | 0.000495381 | 3.02869 | 2 | [103, 221] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_211425__644.json | 79.1667 | missing | missing | missing | |
| 12313 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_211428__998 | 0 | 0.000508961 | 3.17987 | 2 | [103, 228] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231227_211428__998.json | 75.0 | missing | missing | missing | |
| 12314 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small--optim | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231215_202840__581 | 0 | 0.0 | 3.07852 | 2 | [103, 228] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertAsk__1SHOT__20231215_202840__581.json | 75.0 | 0.9 | missing | 0.3 | |
| 12315 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231213_210731__143 | 3 | 0.000858188 | 5.1898 | 2 | [184, 381] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231213_210731__143.json | 87.5 | missing | missing | missing | |
| 12316 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_232030__312 | 3 | 0.000883408 | 5.37552 | 2 | [184, 394] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_232030__312.json | 87.5 | missing | missing | missing | |
| 12317 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_232037__207 | 1 | 0.00111039 | 7.29641 | 0 | [184, 511] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231225_232037__207.json | 54.1667 | missing | missing | missing | |
| 12318 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_211417__734 | 0 | 0.00113755 | 7.07726 | 0 | [184, 525] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_211417__734.json | 50.0 | missing | missing | missing | |
| 12319 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_211422__629 | 4 | 0.000776708 | 4.61206 | 2 | [184, 339] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231227_211422__629.json | 91.6667 | missing | missing | missing | |
| 12320 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small--optim | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231215_202837__340 | 4 | 0.0 | 5.42161 | 2 | [184, 413] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaExpertCoTTask__1SHOT__20231215_202837__340.json | 91.6667 | 0.9 | missing | 0.3 | |
| 12321 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231213_210759__460 | 0 | 0.00108458 | 6.03251 | 0 | [369, 436] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231213_210759__460.json | 50.0 | missing | missing | missing | |
| 12322 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_232141__667 | 0 | 0.000908043 | 4.76185 | 0 | [369, 345] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_232141__667.json | 50.0 | missing | missing | missing | |
| 12323 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_232143__454 | 0 | 0.000444383 | 1.55628 | 0 | [369, 106] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231225_232143__454.json | 0.0 | missing | missing | missing | |
| 12324 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_211452__277 | 4 | 0.00118934 | 6.66555 | 2 | [369, 490] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_211452__277.json | 91.6667 | missing | missing | missing | |
| 12325 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_211457__696 | 4 | 0.000972063 | 5.08695 | 2 | [369, 378] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231227_211457__696.json | 91.6667 | missing | missing | missing | |
| 12326 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small--optim | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231215_202904__390 | 5 | 0.0 | 6.9653 | 2 | [369, 521] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapCoTTask__1SHOT__20231215_202904__390.json | 95.8333 | 0.9 | missing | 0.3 | |
| 12327 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapTask | 1SHOT | true | true | 6 | 20231213_210753__624 | 0 | 0.00110463 | 6.27091 | 0 | [367, 447] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231213_210753__624.json | 50.0 | missing | missing | missing | |
| 12328 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_232127__331 | 0 | 0.00121327 | 6.87592 | 0 | [367, 503] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_232127__331.json | 50.0 | missing | missing | missing | |
| 12329 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_232136__775 | 1 | 0.00149457 | 8.81971 | 0 | [367, 648] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231225_232136__775.json | 54.1667 | missing | missing | missing | |
| 12330 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_211439__227 | 0 | 0.000336389 | 0.876588 | 0 | [367, 51] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_211439__227.json | 0.0 | missing | missing | missing | |
| 12331 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_211445__793 | 3 | 0.00105225 | 5.73597 | 2 | [367, 420] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231227_211445__793.json | 87.5 | missing | missing | missing | |
| 12332 | Apple-MacBook-Pro-M1 | wrap_string | mistral-small--optim | JuliaRecapTask | 1SHOT | true | true | 6 | 20231215_202857__185 | 0 | 0.0 | 4.6172 | 0 | [367, 340] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-small/evaluation__JuliaRecapTask__1SHOT__20231215_202857__185.json | 50.0 | 0.9 | missing | 0.3 | |
| 12333 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | AsIs | 1SHOT | false | false | 6 | 20231213_210708__690 | 0 | 0.000149596 | 5.76437 | 0 | [59, 312] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__AsIs__1SHOT__20231213_210708__690.json | 0.0 | missing | missing | missing | |
| 12334 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | AsIs | 1SHOT | false | false | 6 | 20231225_232008__344 | 0 | 0.000131929 | 3.08123 | 0 | [59, 273] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__AsIs__1SHOT__20231225_232008__344.json | 0.0 | missing | missing | missing | |
| 12335 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | AsIs | 1SHOT | false | false | 6 | 20231225_232010__552 | 0 | 0.000133288 | 2.42544 | 0 | [59, 276] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__AsIs__1SHOT__20231225_232010__552.json | 0.0 | missing | missing | missing | |
| 12336 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny--optim | AsIs | 1SHOT | false | false | 6 | 20231215_202824__661 | 0 | 0.0 | 2.89339 | 0 | [59, 332] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__AsIs__1SHOT__20231215_202824__661.json | 0.0 | 0.9 | missing | 0.3 | |
| 12337 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | InJulia | 1SHOT | true | false | 6 | 20231213_210702__674 | 0 | 0.000154999 | 5.50245 | 0 | [62, 323] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__InJulia__1SHOT__20231213_210702__674.json | 25.0 | missing | missing | missing | |
| 12338 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | InJulia | 1SHOT | true | true | 6 | 20231225_232001__398 | 3 | 0.000152734 | 2.85368 | 2 | [62, 318] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__InJulia__1SHOT__20231225_232001__398.json | 87.5 | missing | missing | missing | |
| 12339 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | InJulia | 1SHOT | true | false | 6 | 20231225_232004__585 | 0 | 0.00015817 | 2.9099 | 0 | [62, 330] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__InJulia__1SHOT__20231225_232004__585.json | 25.0 | missing | missing | missing | |
| 12340 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | InJulia | 1SHOT | false | false | 6 | 20231227_211348__551 | 0 | 0.00021253 | 3.92017 | 0 | [62, 450] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__InJulia__1SHOT__20231227_211348__551.json | 0.0 | missing | missing | missing | |
| 12341 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | InJulia | 1SHOT | false | false | 6 | 20231227_211351__556 | 0 | 0.000151375 | 2.82894 | 0 | [62, 315] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__InJulia__1SHOT__20231227_211351__556.json | 0.0 | missing | missing | missing | |
| 12342 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny--optim | InJulia | 1SHOT | true | true | 6 | 20231215_202821__150 | 0 | 0.0 | 2.98129 | 2 | [62, 343] | 0.10.0-DEV | 2 | 1.1 | {"temperature": 0.9, "top_p": 0.3} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__InJulia__1SHOT__20231215_202821__150.json | 75.0 | 0.9 | missing | 0.3 | |
| 12343 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231213_210657__834 | 0 | 0.000111362 | 4.03144 | 0 | [103, 214] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231213_210657__834.json | 25.0 | missing | missing | missing | |
| 12344 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_231956__122 | 0 | 0.000104114 | 1.73329 | 1 | [103, 198] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_231956__122.json | 62.5 | missing | missing | missing | |
| 12345 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_231958__107 | 0 | 0.000106379 | 1.83967 | 1 | [103, 203] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231225_231958__107.json | 62.5 | missing | missing | missing | |
| 12346 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_211342__731 | 3 | 0.000145337 | 2.58958 | 2 | [103, 289] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_211342__731.json | 87.5 | missing | missing | missing | |
| 12347 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_211344__587 | 0 | 0.000100037 | 1.78653 | 0 | [103, 189] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231227_211344__587.json | 0.0 | missing | missing | missing | |
| 12348 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny--optim | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231215_202818__783 | 0 | 0.0 | 1.88347 | 1 | [103, 212] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertAsk__1SHOT__20231215_202818__783.json | 62.5 | 0.9 | missing | 0.3 | |
| 12349 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231213_210652__206 | 0 | 0.000178421 | 14.1492 | 1 | [184, 337] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231213_210652__206.json | 62.5 | missing | missing | missing | |
| 12350 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_231951__775 | 0 | 0.000125873 | 7.1714 | 0 | [184, 221] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231951__775.json | 0.0 | missing | missing | missing | |
| 12351 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_231954__456 | 0 | 0.000201524 | 3.39092 | 1 | [184, 388] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231225_231954__456.json | 62.5 | missing | missing | missing | |
| 12352 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_211336__243 | 4 | 0.000233687 | 10.5077 | 2 | [184, 459] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_211336__243.json | 91.6667 | missing | missing | missing | |
| 12353 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231227_211339__991 | 0 | 0.000201071 | 3.41205 | 0 | [184, 387] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231227_211339__991.json | 25.0 | missing | missing | missing | |
| 12354 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny--optim | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231215_202816__504 | 0 | 0.0 | 4.70019 | 1 | [184, 328] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaExpertCoTTask__1SHOT__20231215_202816__504.json | 62.5 | 0.9 | missing | 0.3 | |
| 12355 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231213_210726__679 | 0 | 0.000283596 | 8.82789 | 1 | [369, 512] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231213_210726__679.json | 62.5 | missing | missing | missing | |
| 12356 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_232021__799 | 1 | 0.000220176 | 3.2673 | 1 | [369, 372] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_232021__799.json | 66.6667 | missing | missing | missing | |
| 12357 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_232024__846 | 3 | 0.000221535 | 3.29981 | 2 | [369, 375] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231225_232024__846.json | 87.5 | missing | missing | missing | |
| 12358 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_211404__688 | 3 | 0.000269553 | 4.43146 | 2 | [369, 481] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_211404__688.json | 87.5 | missing | missing | missing | |
| 12359 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_211410__779 | 0 | 0.00033252 | 5.52003 | 1 | [369, 620] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231227_211410__779.json | 62.5 | missing | missing | missing | |
| 12360 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny--optim | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231215_202831__170 | 3 | 0.0 | 3.36123 | 2 | [369, 373] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapCoTTask__1SHOT__20231215_202831__170.json | 87.5 | 0.9 | missing | 0.3 | |
| 12361 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 6 | 20231213_210717__199 | 3 | 0.000276521 | 8.75238 | 1 | [367, 497] | 0.10.0-DEV | 2 | 1.1 | {} | MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231213_210717__199.json | 75.0 | missing | missing | missing | |
| 12362 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapTask | 1SHOT | true | false | 6 | 20231225_232013__879 | 0 | 0.000212648 | 3.13305 | 0 | [367, 356] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_232013__879.json | 25.0 | missing | missing | missing | |
| 12363 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_232017__790 | 0 | 0.0002507 | 3.9139 | 1 | [367, 440] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231225_232017__790.json | 62.5 | missing | missing | missing | |
| 12364 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_211356__337 | 0 | 0.000306872 | 5.21231 | 1 | [367, 564] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_211356__337.json | 62.5 | missing | missing | missing | |
| 12365 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_211400__716 | 0 | 0.000246623 | 3.87956 | 0 | [367, 431] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231227_211400__716.json | 0.0 | missing | missing | missing | |
| 12366 | Apple-MacBook-Pro-M1 | wrap_string | mistral-tiny--optim | JuliaRecapTask | 1SHOT | true | true | 6 | 20231215_202828__772 | 0 | 0.0 | 3.7035 | 1 | [367, 401] | 0.10.0-DEV | 2 | 1.1 | {\n "temperature": 0.9,\n "top_p": 0.3\n} | PromptingTools.MistralOpenAISchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral-tiny/evaluation__JuliaRecapTask__1SHOT__20231215_202828__772.json | 62.5 | 0.9 | missing | 0.3 | |
| 12367 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | true | true | 6 | 20231225_081000__880 | 0 | 0.0 | 1.31459 | 0 | [58, 25] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_081000__880.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12368 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | AsIs | 1SHOT | false | false | 6 | 20231225_081003__204 | 0 | 0.0 | 2.62059 | 0 | [58, 60] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__AsIs__1SHOT__20231225_081003__204.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12369 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_080950__643 | 0 | 0.0 | 12.1903 | 0 | [61, 311] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_080950__643.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12370 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_080959__160 | 2 | 0.0 | 8.37088 | 2 | [61, 211] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_080959__160.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12371 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_094353__655 | 4 | 0.0 | 9.59407 | 2 | [61, 242] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_094353__655.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12372 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_080928__995 | 0 | 0.0 | 7.08154 | 0 | [102, 169] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_080928__995.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12373 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_080938__546 | 2 | 0.0 | 9.88958 | 2 | [102, 242] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_080938__546.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12374 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_094343__550 | 0 | 0.0 | 8.30971 | 0 | [102, 200] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_094343__550.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12375 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_080909__256 | 0 | 0.0 | 23.7638 | 1 | [183, 428] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_080909__256.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12376 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_080921__505 | 0 | 0.0 | 11.5366 | 0 | [183, 273] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_080921__505.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12377 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_094334__433 | 0 | 0.0 | 24.4784 | 2 | [183, 463] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094334__433.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12378 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_081045__815 | 0 | 0.0 | 15.3386 | 0 | [369, 338] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081045__815.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12379 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_081103__252 | 0 | 0.0 | 18.1685 | 0 | [369, 408] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081103__252.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12380 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_094423__546 | 0 | 0.0 | 13.9783 | 0 | [369, 303] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094423__546.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12381 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 6 | 20231225_081013__139 | 0 | 0.0 | 10.0488 | 0 | [367, 206] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_081013__139.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12382 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_081029__194 | 3 | 0.0 | 16.4772 | 2 | [367, 366] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_081029__194.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12383 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 6 | 20231227_094409__558 | 0 | 0.0 | 16.3317 | 0 | [367, 361] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_094409__558.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12384 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 6 | 20231228_005246__170 | 0 | 0.0 | 9.95555 | 0 | [60, 319] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005246__170.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12385 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 6 | 20231228_005303__614 | 2 | 0.0 | 16.3505 | 0 | [60, 523] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005303__614.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12386 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 6 | 20231228_005315__108 | 0 | 0.0 | 11.6737 | 2 | [60, 374] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005315__108.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12387 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 6 | 20231228_005326__472 | 0 | 0.0 | 11.0985 | 1 | [60, 356] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005326__472.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12388 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 6 | 20231228_005338__974 | 0 | 0.0 | 11.261 | 0 | [60, 361] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005338__974.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12389 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005201__607 | 0 | 0.0 | 7.52045 | 0 | [101, 229] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005201__607.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12390 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231228_005207__272 | 0 | 0.0 | 5.98151 | 0 | [101, 179] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005207__272.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12391 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005220__472 | 0 | 0.0 | 12.8179 | 0 | [101, 401] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005220__472.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12392 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005227__487 | 0 | 0.0 | 7.3174 | 1 | [101, 223] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005227__487.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12393 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005236__658 | 0 | 0.0 | 8.75651 | 0 | [101, 269] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005236__658.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12394 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005102__808 | 0 | 0.0 | 11.3916 | 0 | [182, 310] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005102__808.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12395 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005119__558 | 0 | 0.0 | 16.5265 | 0 | [182, 501] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005119__558.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12396 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005131__999 | 0 | 0.0 | 11.8957 | 1 | [182, 358] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005131__999.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12397 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005146__785 | 0 | 0.0 | 14.9141 | 1 | [182, 453] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005146__785.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12398 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005154__871 | 0 | 0.0 | 7.58889 | 1 | [182, 220] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005154__871.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12399 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005511__904 | 0 | 0.0 | 12.6459 | 1 | [368, 345] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005511__904.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12400 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005521__630 | 0 | 0.0 | 9.66916 | 0 | [368, 253] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005521__630.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12401 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005537__699 | 0 | 0.0 | 16.2673 | 0 | [368, 456] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005537__699.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12402 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005553__172 | 0 | 0.0 | 15.9933 | 0 | [368, 448] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005553__172.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12403 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005609__437 | 2 | 0.0 | 15.3899 | 2 | [368, 430] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005609__437.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12404 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_005354__863 | 0 | 0.0 | 16.5508 | 1 | [366, 465] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005354__863.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12405 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 6 | 20231228_005408__622 | 0 | 0.0 | 13.9441 | 0 | [366, 386] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005408__622.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12406 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_005423__246 | 0 | 0.0 | 14.4643 | 1 | [366, 402] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005423__246.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12407 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 6 | 20231228_005441__352 | 0 | 0.0 | 18.2581 | 0 | [366, 517] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005441__352.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12408 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_005458__632 | 0 | 0.0 | 16.9208 | 1 | [366, 476] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005458__632.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12409 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 6 | 20231228_005845__660 | 0 | 0.0 | 19.2672 | 0 | [60, 487] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005845__660.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12410 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231228_005900__459 | 0 | 0.0 | 14.7723 | 1 | [60, 374] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005900__459.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12411 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231228_005917__705 | 0 | 0.0 | 16.7526 | 1 | [60, 424] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005917__705.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12412 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231228_005940__744 | 2 | 0.0 | 23.7066 | 2 | [60, 598] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005940__744.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12413 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231228_005954__600 | 0 | 0.0 | 13.8082 | 1 | [60, 349] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005954__600.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12414 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231228_005745__278 | 0 | 0.0 | 11.7319 | 0 | [101, 287] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005745__278.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12415 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005754__197 | 0 | 0.0 | 9.85606 | 1 | [101, 239] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005754__197.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12416 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231228_005808__103 | 0 | 0.0 | 13.2707 | 0 | [101, 326] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005808__103.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12417 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005818__745 | 0 | 0.0 | 10.092 | 1 | [101, 245] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005818__745.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12418 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005825__708 | 0 | 0.0 | 7.34317 | 1 | [101, 174] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005825__708.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12419 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005632__201 | 0 | 0.0 | 22.6391 | 0 | [182, 518] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005632__201.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12420 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231228_005643__897 | 0 | 0.0 | 11.2898 | 0 | [182, 265] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005643__897.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12421 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231228_005657__419 | 0 | 0.0 | 14.5415 | 0 | [182, 347] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005657__419.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12422 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231228_005717__832 | 0 | 0.0 | 19.3888 | 0 | [182, 468] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005717__832.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12423 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005733__389 | 0 | 0.0 | 15.9033 | 0 | [182, 381] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005733__389.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12424 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231228_010144__794 | 0 | 0.0 | 18.7775 | 0 | [368, 419] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010144__794.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12425 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_010204__997 | 0 | 0.0 | 20.5294 | 1 | [368, 462] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010204__997.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12426 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_010224__595 | 3 | 0.0 | 19.9134 | 1 | [368, 447] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010224__595.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12427 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_010242__713 | 0 | 0.0 | 17.8266 | 0 | [368, 396] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010242__713.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12428 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_010300__489 | 0 | 0.0 | 17.3432 | 1 | [368, 384] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010300__489.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12429 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_010010__506 | 0 | 0.0 | 15.4681 | 1 | [366, 338] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010010__506.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12430 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_010032__228 | 0 | 0.0 | 21.862 | 1 | [366, 494] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010032__228.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12431 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 6 | 20231228_010051__442 | 0 | 0.0 | 19.345 | 0 | [366, 433] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010051__442.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12432 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_010111__893 | 1 | 0.0 | 19.6223 | 2 | [366, 440] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010111__893.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12433 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_010125__817 | 2 | 0.0 | 13.5999 | 2 | [366, 292] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010125__817.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12434 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 6 | 20231226_124651__847 | 0 | 0.0 | 15.2931 | 0 | [57, 278] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_124651__847.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12435 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | AsIs | 1SHOT | false | false | 6 | 20231226_124714__920 | 0 | 0.0 | 23.7053 | 0 | [57, 416] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__AsIs__1SHOT__20231226_124714__920.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12436 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 6 | 20231226_124611__548 | 0 | 0.0 | 21.5196 | 1 | [60, 380] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_124611__548.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12437 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 6 | 20231226_124635__587 | 1 | 0.0 | 24.1213 | 2 | [60, 437] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_124635__587.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12438 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_094735__663 | 0 | 0.0 | 25.5407 | 1 | [60, 473] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_094735__663.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12439 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231226_124531__662 | 0 | 0.0 | 16.1485 | 1 | [101, 289] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_124531__662.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12440 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231226_124549__932 | 0 | 0.0 | 18.0881 | 0 | [101, 316] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_124549__932.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12441 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_094709__109 | 0 | 0.0 | 13.2257 | 1 | [101, 237] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_094709__109.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12442 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231226_124450__591 | 0 | 0.0 | 18.7843 | 0 | [182, 328] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_124450__591.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12443 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231226_124515__123 | 0 | 0.0 | 24.8781 | 1 | [182, 432] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_124515__123.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12444 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231227_094656__898 | 0 | 0.0 | 29.7509 | 0 | [182, 369] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094656__898.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12445 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231226_124828__348 | 0 | 0.0 | 21.3832 | 1 | [368, 354] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124828__348.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12446 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231226_124905__901 | 0 | 0.0 | 36.906 | 1 | [368, 628] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124905__901.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12447 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_094821__168 | 2 | 0.0 | 22.1892 | 2 | [368, 370] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094821__168.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12448 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231226_124740__343 | 0 | 0.0 | 25.5504 | 0 | [366, 418] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_124740__343.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12449 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 6 | 20231226_124806__335 | 0 | 0.0 | 26.3005 | 0 | [366, 439] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_124806__335.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12450 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 6 | 20231227_094759__582 | 0 | 0.0 | 24.1818 | 0 | [366, 406] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_094759__582.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12451 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_095221__951 | 0 | 0.0 | 58.0239 | 0 | [65, 343] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_095221__951.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12452 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | false | false | 6 | 20231227_134039__389 | 0 | 0.0 | 28.3166 | 0 | [65, 162] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_134039__389.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12453 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_134121__596 | 0 | 0.0 | 41.4799 | 1 | [65, 243] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_134121__596.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12454 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_134157__188 | 0 | 0.0 | 36.6099 | 1 | [65, 213] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_134157__188.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12455 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_095123__548 | 0 | 0.0 | 50.2635 | 1 | [104, 290] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_095123__548.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12456 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_133815__467 | 0 | 0.0 | 26.5751 | 0 | [104, 146] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_133815__467.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12457 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_133902__672 | 0 | 0.0 | 46.8198 | 0 | [104, 270] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_133902__672.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12458 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_134011__752 | 0 | 0.0 | 68.704 | 0 | [104, 402] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_134011__752.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12459 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_095033__822 | 0 | 0.0 | 77.5115 | 1 | [184, 298] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095033__822.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12460 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_133550__919 | 2 | 0.0 | 65.2941 | 2 | [184, 343] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133550__919.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12461 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_133655__938 | 0 | 0.0 | 65.4788 | 1 | [184, 370] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133655__938.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12462 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_133748__319 | 0 | 0.0 | 52.7643 | 1 | [184, 294] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133748__319.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12463 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_095447__646 | 0 | 0.0 | 65.7757 | 1 | [378, 335] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_095447__646.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12464 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_134626__852 | 2 | 0.0 | 45.0787 | 2 | [378, 212] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_134626__852.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12465 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_140032__982 | 0 | 0.0 | 20.2129 | 0 | [378, 5] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_140032__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12466 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_140259__869 | 0 | 0.0 | 146.594 | 1 | [378, 769] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_140259__869.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12467 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_095341__346 | 0 | 0.0 | 79.8964 | 0 | [376, 418] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_095341__346.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12468 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_134300__781 | 0 | 0.0 | 61.9905 | 0 | [376, 314] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_134300__781.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12469 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_134432__765 | 0 | 0.0 | 92.8684 | 0 | [376, 494] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_134432__765.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12470 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_134540__546 | 4 | 0.0 | 67.3954 | 2 | [376, 344] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_134540__546.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12471 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 6 | 20231225_081247__864 | 0 | 0.0 | 9.44659 | 0 | [66, 235] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_081247__864.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12472 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | AsIs | 1SHOT | false | false | 6 | 20231225_081257__822 | 0 | 0.0 | 10.5084 | 0 | [66, 263] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__AsIs__1SHOT__20231225_081257__822.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12473 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_081227__255 | 4 | 0.0 | 11.6657 | 2 | [69, 293] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_081227__255.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12474 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_081237__394 | 3 | 0.0 | 10.0398 | 2 | [69, 251] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_081237__394.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12475 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_094502__526 | 0 | 0.0 | 13.0414 | 0 | [69, 325] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_094502__526.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12476 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_081201__330 | 0 | 0.0 | 12.0488 | 0 | [110, 297] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_081201__330.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12477 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_081215__781 | 2 | 0.0 | 13.6557 | 2 | [110, 338] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_081215__781.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12478 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_094449__279 | 0 | 0.0 | 11.2717 | 1 | [110, 262] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_094449__279.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12479 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231225_081134__640 | 0 | 0.0 | 30.2001 | 0 | [191, 581] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_081134__640.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12480 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_081149__592 | 1 | 0.0 | 15.221 | 2 | [191, 366] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_081149__592.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12481 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_094437__329 | 5 | 0.0 | 14.097 | 2 | [191, 183] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094437__329.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12482 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_081343__915 | 3 | 0.0 | 17.2608 | 2 | [377, 385] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081343__915.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12483 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_081358__471 | 4 | 0.0 | 14.8113 | 2 | [377, 324] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081358__471.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12484 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_094526__441 | 4 | 0.0 | 11.8538 | 2 | [377, 249] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094526__441.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12485 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_081311__266 | 1 | 0.0 | 13.2143 | 0 | [375, 285] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_081311__266.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12486 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_081326__994 | 5 | 0.0 | 14.8795 | 2 | [375, 326] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_081326__994.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12487 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_094514__439 | 3 | 0.0 | 12.2026 | 2 | [375, 257] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_094514__439.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12488 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 6 | 20231214_090158__383 | 0 | 0.0 | 9.81935 | 0 | [48, 299] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231214_090158__383.json | 0.0 | missing | missing | missing | |
| 12489 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 6 | 20231225_071534__219 | 0 | 0.0 | 9.32618 | 0 | [64, 296] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_071534__219.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12490 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | AsIs | 1SHOT | false | false | 6 | 20231225_071550__945 | 0 | 0.0 | 16.2239 | 0 | [64, 515] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__AsIs__1SHOT__20231225_071550__945.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12491 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 6 | 20231214_090148__772 | 0 | 0.0 | 20.1072 | 0 | [65, 589] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_090148__772.json | 50.0 | missing | missing | missing | |
| 12492 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 6 | 20231225_071514__817 | 2 | 0.0 | 12.2569 | 2 | [67, 384] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_071514__817.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12493 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 6 | 20231225_071524__219 | 4 | 0.0 | 9.71242 | 1 | [67, 304] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_071524__219.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12494 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 6 | 20231227_092121__914 | 0 | 0.0 | 19.2598 | 0 | [67, 618] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_092121__914.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12495 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231214_090128__597 | 0 | 0.0 | 6.77809 | 0 | [94, 195] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_090128__597.json | 50.0 | missing | missing | missing | |
| 12496 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_071451__346 | 0 | 0.0 | 9.77129 | 2 | [108, 299] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_071451__346.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12497 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_071502__475 | 1 | 0.0 | 11.3287 | 2 | [108, 349] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_071502__475.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12498 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_092102__597 | 2 | 0.0 | 10.1531 | 2 | [108, 320] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_092102__597.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12499 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231214_090121__677 | 0 | 0.0 | 12.2 | 0 | [175, 328] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_090121__677.json | 0.0 | missing | missing | missing | |
| 12500 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231225_071430__185 | 0 | 0.0 | 15.8392 | 0 | [189, 308] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071430__185.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12501 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_071441__417 | 1 | 0.0 | 10.3551 | 2 | [189, 306] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071441__417.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12502 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_092051__898 | 0 | 0.0 | 19.0355 | 0 | [189, 434] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092051__898.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12503 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231214_090220__255 | 0 | 0.0 | 1.49227 | 0 | [11, 38] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_090220__255.json | 0.0 | missing | missing | missing | |
| 12504 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_071646__637 | 5 | 0.0 | 20.0825 | 2 | [375, 565] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071646__637.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12505 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_071657__106 | 0 | 0.0 | 10.5812 | 0 | [375, 279] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071657__106.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12506 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_092148__688 | 0 | 0.0 | 13.664 | 2 | [375, 383] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092148__688.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12507 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 6 | 20231214_090219__819 | 0 | 0.0 | 21.0518 | 0 | [365, 491] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_090219__819.json | 0.0 | missing | missing | missing | |
| 12508 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_071607__834 | 0 | 0.0 | 16.6305 | 1 | [373, 462] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_071607__834.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12509 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_071626__131 | 0 | 0.0 | 19.0875 | 2 | [373, 537] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_071626__131.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12510 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_092135__458 | 2 | 0.0 | 13.3705 | 2 | [373, 374] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_092135__458.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12511 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | AsIs | 1SHOT | false | false | 6 | 20231214_091307__556 | 0 | 0.0 | 23.4719 | 0 | [48, 683] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__AsIs__1SHOT__20231214_091307__556.json | 0.0 | missing | missing | missing | |
| 12512 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | AsIs | 1SHOT | false | false | 6 | 20231225_074035__619 | 0 | 0.0 | 26.4536 | 0 | [65, 477] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__AsIs__1SHOT__20231225_074035__619.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12513 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | AsIs | 1SHOT | false | false | 6 | 20231225_074056__842 | 0 | 0.0 | 21.1234 | 0 | [65, 379] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__AsIs__1SHOT__20231225_074056__842.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12514 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | InJulia | 1SHOT | true | true | 6 | 20231214_091243__882 | 0 | 0.0 | 15.5653 | 0 | [65, 462] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__InJulia__1SHOT__20231214_091243__882.json | 50.0 | missing | missing | missing | |
| 12515 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | InJulia | 1SHOT | false | false | 6 | 20231225_073950__106 | 0 | 0.0 | 3.41398 | 0 | [68, 50] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__InJulia__1SHOT__20231225_073950__106.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12516 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | InJulia | 1SHOT | true | true | 6 | 20231225_074008__138 | 6 | 0.0 | 18.4266 | 2 | [68, 331] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__InJulia__1SHOT__20231225_074008__138.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12517 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | InJulia | 1SHOT | true | true | 6 | 20231227_093025__219 | 0 | 0.0 | 23.1359 | 1 | [68, 422] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__InJulia__1SHOT__20231227_093025__219.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12518 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231214_091227__288 | 0 | 0.0 | 7.28997 | 0 | [94, 211] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_091227__288.json | 50.0 | missing | missing | missing | |
| 12519 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073930__197 | 1 | 0.0 | 21.6025 | 2 | [107, 381] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_073930__197.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12520 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_073946__761 | 0 | 0.0 | 16.6054 | 0 | [107, 292] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_073946__761.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12521 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_093002__199 | 0 | 0.0 | 13.8947 | 0 | [107, 245] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_093002__199.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12522 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231214_091220__566 | 0 | 0.0 | 12.3513 | 0 | [175, 332] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_091220__566.json | 0.0 | missing | missing | missing | |
| 12523 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_073848__120 | 0 | 0.0 | 26.7905 | 0 | [188, 285] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073848__120.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12524 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_073908__684 | 0 | 0.0 | 19.6467 | 0 | [188, 331] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073908__684.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12525 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_092948__469 | 0 | 0.0 | 21.7038 | 0 | [188, 200] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092948__469.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12526 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231214_091351__558 | 0 | 0.0 | 14.0463 | 0 | [11, 389] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_091351__558.json | 0.0 | missing | missing | missing | |
| 12527 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_074159__135 | 6 | 0.0 | 36.4683 | 2 | [371, 587] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_074159__135.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12528 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_074205__740 | 0 | 0.0 | 5.64447 | 0 | [371, 47] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_074205__740.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12529 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_093134__692 | 0 | 0.0 | 38.4027 | 2 | [371, 615] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_093134__692.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12530 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 6 | 20231214_091337__234 | 0 | 0.0 | 30.0525 | 0 | [365, 715] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_091337__234.json | 0.0 | missing | missing | missing | |
| 12531 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_074102__245 | 0 | 0.0 | 5.59621 | 0 | [368, 46] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_074102__245.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12532 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_074123__673 | 0 | 0.0 | 20.9842 | 0 | [368, 323] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_074123__673.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12533 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_093056__211 | 1 | 0.0 | 29.997 | 2 | [368, 482] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_093056__211.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12534 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 6 | 20231225_081529__922 | 0 | 0.0 | 14.6108 | 0 | [54, 565] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_081529__922.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12535 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | AsIs | 1SHOT | false | false | 6 | 20231225_081548__631 | 0 | 0.0 | 18.9352 | 0 | [54, 723] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__AsIs__1SHOT__20231225_081548__631.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12536 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 6 | 20231225_081508__278 | 0 | 0.0 | 5.28398 | 0 | [57, 208] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_081508__278.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12537 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 6 | 20231225_081515__110 | 0 | 0.0 | 7.24517 | 0 | [57, 285] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_081515__110.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12538 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 6 | 20231227_094559__602 | 0 | 0.0 | 7.972 | 0 | [57, 310] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_094559__602.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12539 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_081432__144 | 0 | 0.0 | 19.3651 | 0 | [94, 729] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_081432__144.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12540 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_081502__390 | 0 | 0.0 | 29.8562 | 0 | [94, 1085] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_081502__390.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12541 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_094551__995 | 0 | 0.0 | 15.2566 | 0 | [94, 575] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_094551__995.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12542 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_081409__997 | 0 | 0.0 | 10.5271 | 0 | [173, 248] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_081409__997.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12543 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_081413__561 | 0 | 0.0 | 4.53324 | 0 | [173, 159] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_081413__561.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12544 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_094536__883 | 0 | 0.0 | 9.37842 | 0 | [173, 210] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094536__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12545 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_081634__229 | 0 | 0.0 | 11.1142 | 0 | [346, 376] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081634__229.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12546 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_081635__233 | 0 | 0.0 | 1.11885 | 0 | [346, 1] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081635__233.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12547 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_094626__325 | 0 | 0.0 | 15.5198 | 0 | [346, 529] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094626__325.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12548 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_081559__358 | 0 | 0.0 | 10.8354 | 0 | [343, 366] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_081559__358.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12549 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_081623__955 | 0 | 0.0 | 23.6878 | 0 | [343, 811] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_081623__955.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12550 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_094610__584 | 0 | 0.0 | 11.482 | 0 | [343, 387] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_094610__584.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12551 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 6 | 20231214_091442__232 | 0 | 0.0 | 13.6291 | 0 | [48, 411] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231214_091442__232.json | 0.0 | missing | missing | missing | |
| 12552 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | AsIs | 1SHOT | false | false | 6 | 20231225_074700__578 | 0 | 0.0 | 36.2534 | 0 | [73, 274] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_074700__578.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12553 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | AsIs | 1SHOT | true | true | 6 | 20231225_074741__562 | 5 | 0.0 | 40.9988 | 2 | [73, 308] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__AsIs__1SHOT__20231225_074741__562.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12554 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 6 | 20231214_091429__429 | 0 | 0.0 | 13.757 | 0 | [65, 410] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_091429__429.json | 0.0 | missing | missing | missing | |
| 12555 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 6 | 20231225_074546__616 | 1 | 0.0 | 39.5885 | 2 | [76, 301] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_074546__616.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 12556 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 6 | 20231225_074624__550 | 2 | 0.0 | 38.3352 | 2 | [76, 290] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_074624__550.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12557 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 6 | 20231227_093338__884 | 3 | 0.0 | 36.5689 | 2 | [76, 282] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_093338__884.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12558 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231214_091415__431 | 0 | 0.0 | 10.4333 | 0 | [94, 306] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_091415__431.json | 50.0 | missing | missing | missing | |
| 12559 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_074419__954 | 6 | 0.0 | 36.1501 | 2 | [115, 270] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_074419__954.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12560 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_074506__800 | 5 | 0.0 | 46.2149 | 2 | [115, 350] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_074506__800.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12561 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_093302__426 | 3 | 0.0 | 30.6901 | 2 | [115, 229] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_093302__426.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12562 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231214_091404__148 | 0 | 0.0 | 13.2991 | 0 | [175, 359] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_091404__148.json | 50.0 | missing | missing | missing | |
| 12563 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_074254__420 | 0 | 0.0 | 49.1583 | 1 | [196, 171] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_074254__420.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12564 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_074343__397 | 6 | 0.0 | 48.4383 | 2 | [196, 341] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_074343__397.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12565 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_093231__460 | 5 | 0.0 | 56.6044 | 2 | [196, 249] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_093231__460.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12566 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231214_091524__520 | 0 | 0.0 | 18.3141 | 2 | [11, 500] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_091524__520.json | 75.0 | missing | missing | missing | |
| 12567 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_075003__536 | 3 | 0.0 | 39.0995 | 2 | [379, 241] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_075003__536.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12568 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_075049__767 | 5 | 0.0 | 45.401 | 2 | [379, 289] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_075049__767.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12569 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_093459__953 | 5 | 0.0 | 40.2278 | 2 | [379, 258] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_093459__953.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12570 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231214_091506__206 | 0 | 0.0 | 23.3595 | 0 | [365, 550] | 0.10.0-DEV | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_091506__206.json | 0.0 | missing | missing | missing | |
| 12571 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_074822__974 | 3 | 0.0 | 40.0888 | 2 | [376, 249] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_074822__974.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12572 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_074923__531 | 5 | 0.0 | 61.4046 | 2 | [376, 412] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_074923__531.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12573 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_093418__751 | 2 | 0.0 | 39.8572 | 2 | [376, 256] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_093418__751.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12574 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 6 | 20231225_080701__491 | 0 | 0.0 | 21.0746 | 0 | [66, 358] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_080701__491.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12575 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | AsIs | 1SHOT | false | false | 6 | 20231225_080717__235 | 0 | 0.0 | 15.6546 | 0 | [66, 264] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__AsIs__1SHOT__20231225_080717__235.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12576 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_080618__303 | 2 | 0.0 | 14.1745 | 2 | [69, 238] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_080618__303.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 12577 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 6 | 20231225_080640__478 | 0 | 0.0 | 22.2335 | 0 | [69, 378] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_080640__478.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12578 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_094228__570 | 3 | 0.0 | 27.734 | 2 | [69, 471] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_094228__570.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 12579 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_080545__819 | 0 | 0.0 | 19.6535 | 0 | [110, 327] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_080545__819.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12580 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_080603__866 | 0 | 0.0 | 17.8643 | 0 | [110, 296] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_080603__866.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12581 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_094200__132 | 0 | 0.0 | 19.7321 | 0 | [110, 328] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_094200__132.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12582 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_080511__583 | 0 | 0.0 | 31.3271 | 0 | [191, 315] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_080511__583.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12583 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_080525__143 | 0 | 0.0 | 13.1628 | 0 | [191, 200] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_080525__143.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12584 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_094140__714 | 0 | 0.0 | 27.9301 | 0 | [191, 306] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094140__714.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12585 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_080828__194 | 0 | 0.0 | 20.2183 | 2 | [377, 295] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_080828__194.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12586 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_080845__173 | 0 | 0.0 | 17.0302 | 0 | [377, 241] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_080845__173.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12587 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_094310__362 | 0 | 0.0 | 22.5646 | 0 | [377, 333] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094310__362.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12588 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_080745__918 | 0 | 0.0 | 28.1364 | 0 | [375, 426] | 0.10.0-DEV | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_080745__918.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
⋮ (table preview truncated — 12,659 rows in total)
Overview of Prompt Templates
We've added an "AsIs" prompt template, which is just the raw task definition (nothing added). As you can see below, it's pretty bad, because the models fail to detect from the context that they should produce Julia code. In short, always use a prompt template, even if it's just a simple one.
The scatter plot below shows average elapsed time versus average score, with each prompt template in a different color.
fig = @chain df begin
@aside local xlims = quantile(df.elapsed_seconds, [0.01, 0.99])
@rsubset !occursin("--optim", :model)
@by [:model, :prompt_label] begin
:elapsed = mean(:elapsed_seconds)
:elapsed_median = median(:elapsed_seconds)
:score = mean(:score)
:score_median = median(:score)
:cnt = $nrow
end
data(_) * mapping(:elapsed => "Avg. Elapsed Time (s)",
:score => "Avg. Score (Max 100 pts)",
color = :prompt_label => "Prompt")
draw(; figure = (size = (600, 600),),
axis = (xticklabelrotation = 45,
title = "Elapsed Time vs Score by Prompt [PRELIMINARY]",
limits = (xlims..., nothing, nothing)),
palettes = (; color = Makie.ColorSchemes.tab20.colors))
end
SAVE_PLOTS && save("assets/elapsed-vs-score-scatter-prompts.png", fig)
fig
A few learnings so far:
- Never use the "AsIs" prompt (i.e., the raw task definition). ALWAYS add some context around the language, situation, etc.
- Even a simple "In Julia, answer XYZ" prompt can be quite effective. The bigger prompts ("CoT" stands for Chain of Thought) may confuse the smaller models, which is likely why this simple prompt scores so well on average.
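To illustrate the difference in practice, switching from a raw task to a template is a one-line change in PromptingTools.jl. The sketch below is illustrative only: the task string is hypothetical, and it assumes a configured model backend (e.g., an `OPENAI_API_KEY` in your environment).

```julia
using PromptingTools

# A hypothetical task definition, as it would appear in the benchmark
task = "Write a function `wrap_string(s, width)` that wraps text to a given line width."

# "AsIs": the raw task with no context -- the model may not even answer in Julia
msg_asis = aigenerate(task)

# Templated: the same task wrapped in the JuliaExpertAsk template
msg_templated = aigenerate(:JuliaExpertAsk; ask = task)
```

You can browse the available templates with `aitemplates("Julia")` to see what each one adds around your task.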
Table:
output = @chain df begin
@by [:prompt_label] begin
:elapsed = mean(:elapsed_seconds)
:elapsed_median = median(:elapsed_seconds)
:score = mean(:score)
:score_median = median(:score)
end
transform(_, names(_, Number) .=> ByRow(x -> round(x, digits = 1)), renamecols = false)
@orderby -:score
rename("prompt_label" => "Prompt Template",
"score" => "Avg. Score (Max 100 pts)",
"elapsed_median" => "Elapsed (s, median)",
"elapsed" => "Elapsed (s, average)",
"score_median" => "Median Score (Max 100 pts)")
end
# markdown_table(output, String) |> clipboard
markdown_table(output)
| Prompt Template | Elapsed (s, average) | Elapsed (s, median) | Avg. Score (Max 100 pts) | Median Score (Max 100 pts) |
|---|---|---|---|---|
| InJulia | 15.2 | 10.7 | 49.9 | 50.0 |
| JuliaExpertAsk | 10.4 | 6.9 | 46.7 | 50.0 |
| JuliaRecapTask | 18.7 | 14.0 | 45.9 | 50.0 |
| JuliaExpertCoTTask | 17.1 | 12.7 | 42.0 | 50.0 |
| JuliaRecapCoTTask | 17.8 | 13.2 | 41.2 | 50.0 |
| AsIs | 36.3 | 11.2 | 9.8 | 0.0 |
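Note the speed–quality trade-off: JuliaExpertAsk scores slightly below InJulia but runs roughly a third faster on average. A quick back-of-envelope "score per second" comparison, using the averages from the table above, makes this concrete (a sketch; the numbers are copied from the table, not recomputed):

```julia
# Score-per-second from the summary table above (average elapsed time)
prompts = ["InJulia", "JuliaExpertAsk", "JuliaRecapTask",
           "JuliaExpertCoTTask", "JuliaRecapCoTTask", "AsIs"]
score   = [49.9, 46.7, 45.9, 42.0, 41.2, 9.8]
elapsed = [15.2, 10.4, 18.7, 17.1, 17.8, 36.3]

efficiency = score ./ elapsed
for (p, e) in sort(collect(zip(prompts, efficiency)); by = last, rev = true)
    println(rpad(p, 20), round(e, digits = 2))
end
```

By this measure, JuliaExpertAsk comes out on top (≈4.5 points per second vs ≈3.3 for InJulia), so it may be the better default when latency or cost matters.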
This page was generated using Literate.jl.